[Binary artifact, not text: a POSIX ustar tar archive of `var/home/core/zuul-output/` (owner `core`) containing `logs/kubelet.log.gz`, a gzip-compressed kubelet log. The compressed payload is not recoverable as readable text.]
[W>0%6؟}Ua=.d-wNᛳPb^^tr#Br#XUnSm prQږ8V0 JfI$Hk bLu5}V4f5qKJk*ھ̇LMkQ4KMKahn릫N91t}w0%sH[]F%VU^Wʤ)e |^4۴`$6Um_n&J!3Tl[zE|*ܧġiViksVܢ&Wtw{4Be0mOmO~l-kĮSKz6~=[,vZ`tai65U[&s(QSn 'h oA\3)Rs+Є#uz7Z{nNdr.Gк?ojc%1d%TNs鞤r#J@AXD񀍱*{c;8uJ@lOγhf.dv! yg.{e-d&Ŏ& YYy(W7^{n.lޏ rfF c3WWWoxqv+2`{ٶ]Ho&?Zx'1K; cJ#ϓڐ(Z vVDfwE%K8¤+Rr]GX_NB`gw^JP {\_l'>T@Eq0ȕӨ#(}ewh_ͼo\)epZ=ojՆ{I3i-?DQ.MOˋVQۥBB,ZZJL[%-h w[KJ󭴝{p-#  [z YX8D*ZYPAYn uO Kp3"'DŽ)RbS"!&"7s$#=֜@'N~m͖gQ,y(&J;%wQ"H0#6xm{Yc+1esBA^gYx=X7mIEʠ'Zެٿ7Xby0lVϴ4C0)pOzj9-J-AƒEiC<-Ex7c&cB)7j<[֑aYqGkZ )$.oMpgw6vr6>3;ơB;ytGw.Ź7#J۹9MWi}\Qe2R gKF*ā c 7tC]',JJwC{K5nnJf-uD6 }toxw3Zʄ2a^!X&HSE57V"FMsFV;|x&yY{ &@]{ra(|W2 i06<10c~n,ߵ0-8'-v~oꏪo?,fQySuV[A;QӢ hZ+cxUChzrʁ+..^;:,`5_ySS IK~ճ]ixR{ɱ$Q88+R 3c V0Pr.`9V&gøDM齻R:*?e8]v,orūgMu3LLnG]nP]l`]`7! YM͗ʜsΊ) 5<3E}_V,7U^ 6 >KYG|2|7c'_znŗaEM#Wc==Zo8Vfvx[z.ڊm{l,7J&z?_% ̭g>nT+b4QoKM9*s{%k=$Nd<H7,rX94ʇK 1bd-G+J'7$}u5AF|9lm㠺y|{m},Gz$~xGQ`tY,7RnzPY@VӑH OSb 8̝\׭[%ZPbkTҎӲnZ\ޅ\V;K̂sų!WG,XY(X(CY&ˆcIok0;ɔ^רԾa^@(L7t,kV['M\HJ*I7Q)Fty\|,qohQcNJ_v(6^gzȕ_1tpx 09`_`"Yv,y-bKeV˒cWdWI d*T{Ep޳{Zk]ɛ5c4U?i>ͧ 'x &*D褡EJi'M#CQ;iᤇ.w_|><㙜+:Q[ H'iMZJp;M<3!U:cƥ.(]wtH>vh7T }Vzm t(Y~B`2@@ʜ)U+Y=99@X˻+[=k&lxDӼ'Wɐ^=dX;96t't|Z\j9Y*V?_\>tԐ&]<CYt<זtyUv(TJDzŘ-A@f)F2;}r ԗ'T?c6\)bQptg0TJ*i.lEAFkƷ^rDPRu3TwH /3(Wmk/Ky:ni~bW)ag+ϧ}>~݈ůoTh.y;ަ=<1l'ެ=;һOʎ2{jM_Ԛ}iti+`M^Q3Y6iuꌥy[n1:aZI"@~dgQ7( Lg{Ch 7Qv$tD`(LБo!W*>1,8<%mRY ICLlӥHEgAԍ%p˰pƽq9v(|?{}~=`xtPٰ--2SdjCy2,kȪs(\Be]rLу õ4vihP&f @xP-bLV@HN F&H9["£(#H|s2TE\(e,ul,a7vw8V[!WD#[[t| !NflkY79fKf/}y5\χ+t-!mJ[bBy>)Ri9-b=#)f<+G;:Ƶcx٧+ߟ1 g>S͏'ѼoQN9BZK9 Zu'8,\xI59O_|]׼kV_Ye=75r_Zl\Ѯ1 ͮqWi=]cJv]cew 7A}ZL^(3^~xR֭@ @M}D~=Ҫ2/o5y`aO>/pзTza@{Kd)~YBS2GT+i^;הټ87/Ѷ;.eH؁y֧@u>ެF .iZsjY,9)a'uVU_lJ5c]Y43WMj_R^([}A6ګnۇ$o\}5GbW&CGSq!vOo!bX+X{w4±+;twUԭGtW1U8qW,XUVCwWUń~we+:"wqWU\8`Jk*]*q-Ww___fLrnnswlWI[*> t.h#@.No:XV=c}Р,i%0 ꥁAFyH% Q-&Ln ?()#c,IoK݋::fU 2Kk^L1f(Z-PJI圅HS4TR..)D>$κ[ am.L@LЧ)|42`Hd;nXblNZ#:ee$d2ݙu[J,,|4 Ɵ(bPlBQ(b@*VaI8'Y$kdqE@7NI8'Y$kdq5NI8'Y$kdloy7/|E b{ޓ];#xۅ!,Ԣ@CR$GzŘ-A@f)Fƨ%{(ԙ= S3 -R"(^Y$|E0zƘoo!g z=_h.,uŮ]5O 
GT]~:kR]=|:*֭vW8 fM]k.|./jBq׽ \39jJ+{ G̺R(3wFMFB ϰ]U/}gH/K Z1K~*xZ*ISԖkIJŚU{69Sb:CeCl%$2k]&C&bF{VO=/&z9v)aV + 󱊘WѓXZ 1E5yE!́'%P+PٖWhcK+ֺZ]%qT9R@t:Ji j={Y{D)?v?7o0%tt;JhOO`pup>/ׇ^` k8QiU.Y@ ٩]EPP)HO_z!YڔAu!ARlƛ.2zf[BB*G-i EL!(VC)ڹ(A)P YB(N2Ҥ$C>PiY̾dPF6%([]z㋫90\x卅OGNiWUd0^.)"ixuOx5/:7jl?vmAk:̹SƩ.{n.";t!Km5Vxv02ϱa6lJȥL:,A՞GAI1yLVL&[h m IsѾTJd 4gث!Iqd;fdAdm&Ά6wz~(޿hf]Jr:Zy-W-oF-6K\_Zox&[,&M|P(aa$EI6-C0;heHkl! |A;u{`jrsF+%x;Y$E6>Av`׳pϵljBUOC̩XuwY@ ~MC>9BRSD%`P{%Q"aSvԴ!b ȦUf+bI kpdQJtmِ^Bb`<"tȺd6i!|_bZЉ`\(0Ecy@Z ?;_q'Orڼ9Rr7b]9 &$')50/>ד1b~k0N~e~dTD|'1NukTTzPP+qzR8nOGcv4oE`*)IHø$x{\\)݉oy@Uh8Ki_/c|OuR3ݞ\ʼn#OfכxWgu2(? Wi?h! ݣ]ӗ+>֚DsvLOeFq\#69/7㏳~^g|Xgc., f?D@ycK w C]2 + x~콻j.Iv#t2SLHgf{ PwYf:`ӂ0ւ* B]滨`qeSnq*D4_Ӹs2+mh~} ?C( SJ8Ą-3"vCa$q3AP,ⵅ(:CuC]B|AWtY7* ᕙlvȜaxfJd&Hh'06N+;y\_JxbIvdgmuþ4;Un0t}i 0B0/O/m wBl3Rbt#A"[[գ_Rڿmt2b H,<`H,Po*$4NuԨ8}9t[cn~\]xl/-r7k|U>}̐"H0#6xm{Yc+%\p o;['eMjH;Ic=ŖQod`4ۙ}|KwԺ)3$5i#`Z(ڌ-2m|̒5;˜쌛7͡+oJAXHomО X_lSCzCHmH vVt_SJx @UY@|`.NJ_ЛUքMy oCo0J{O KW|('o"Zlzsi8:l&ڗf2Nv &S} ƻ::9gtրԺvHkOHٝ'yt+[kq~>m-ix`,R*c e2a)$ŤC[+yd@BI<ǐ)8bTHU3 N{E -39$.YPHO-5C9G2hX>rɀ 4d Xa}n$F_/㹡ne[9vhOaQj'%36 0EF3fX?p2AZdu91cp2Q1G Em:2,"1rbrX+/i$NEF;rRtVN:4QhGWjG'̷|ޞGTίڼVo1ʐ&ٌ2%p`B9vy;e3tqy[z u:/SX靕illJ&_bK]u0 jzL,c05%a4UTsc%bD|A3FwNo#Oʯar۟FmcwPaTAarQP&(=(BΊ9S`ً?G\i{@ZI.g$jz7-7,xróB7Tn6ʏo,v^E)zEO{eezjm}}:O>_-\wP$Ö)24ό*:/A[JEK9NT&J+M"=\B4U kVXH1Kͥ c0 VY=ni'4d+|y~|<顖&l[QwG115WH# >1%G|eX\zyRtO'\l  wuu?ڳZW]݋Z~ԕcA4aGJJ|,*Q^]aABSyD xU"cQWZ!]]%*Տ$t3r!\0s,mTB^r" $cLK21<4;@}eJ^ C5'Via"]eڢ? |֕^8̉:8R [iw8P8J#&wWUTG>N5Eˏxf[Ƣc\g8oSg1tf8+0Tu)2A*8ʝ7\ LŘ8P=d:FY^1HIw88jκؼ'=BBXʟ'@fzz~(HZyA@= ciʉ?gl~1I1eړLZ$3f*sKnQu]`(c,e ):B"q>82$EHkΜס w$ )[@c$"﵌%6MVHKDˁa)N73p9V2 IԼ~k n=ߠ5b8E*A͚>װft]›w=}qIY r%ieD]veZjO bxSՙtRlbR4 >kvU{Y6r=ϵ\hEyf-$QcHw5Sux"Nb-bs$\Ȅ Oc*5(K)µ߂ȳyklh`OP Lne_y|%}Z_K ?͎Ж*=Zu\;r{N `{M,h$њh.ȡ kn$x>} d4Y^1")wL*"Q* J3K`@i^2:>+C㞇a(cWWquqBb('U$0TXЫQ{-0"ZpQU"؎uX[&!p ID DU  Ei \g\4GZmjs)uBF\HʄGB:Hs}j36|H>N=|Rܳ/]-D`%gc򈜇Io b$ ̋OQʸ/&˥~R"ȟ/{{<2(b{ d); B 9B'RP;1y.$< U@0! H =n939%Y%%. 
I"cq:gz=nzWP$*ʾ+4ы'W?ʰ&"Lc'nV0di4өk+.` *8ɏ[sZ?'o'l 1T:_oa/ʵzs"`IvzaKPHy$Ɨt4 iFa=Q`*|4 _'&'ydӨsTTN:B-#i`hR9KFY=2Wt*',;=׿vï^u_o?9{&g߼]'XIp)ʲ`xT<;C]o =껩/*{﮿Ko7uH"T9j]iyU]4-Cm-B T ޅm 6W:qYp .}B`@_u:۸XAܼFzHvOP. Ô1a<@ LA5sC3ILP"xPjPkuЕ0|AP^ Fi'Y?C)ؐ,YLG$HWN`+OlS4Ɏl謿ql{sfʍ&5-UaL3 fPhQR:( `WJ%kImh,b`]Mt2PXs_gڳs Z 幂my?2nCX wd9vd\! ? j/ʩ$P/L8`Y4'z&ndbK+mHE.ty5{1cag<%ؔ-odI/IK*حVW%QY_DJ  Zb`FsYj˱S(ن$dr@XHHy)02z jltUzf.9S]b6uORQJ=ϯyt^Y5v'wdl*S"JY'RRjP'cJ O [U3*QҲVhf8L0Rl=.E38bJ8$\DfFvXT=ثX(6//hMj{s:QQG@\߀bθȩ ^P:( 6 56s !ea,9(h[5$,K]F2@Nhf4 SL4( wT1HZI;WCž:d ɘ5Rh J35\G $"Jh4Psk'z'iuOI;k ۃPTf}.3ۺ) '9R8 !Nuo0-5 t}R+ĕ0Xfb& l&Zy%]\IWs=V,BZzRru#$DgJ Z{Vz-Dhw\b+HJ!׻aۄ<,ҷgZŮ !w?nTR> ُwƈƎׂ) υJJH6(+SQ8Y+uJpLfo*YƔ$depnrrLkeP¡P@]e Fv ^ހn,y]NWWnuAڦh-vEgYɲTBULÇG>֟V0!Yd 8 486Q΄L΄<~ dT@)AE`]Jzς(rZf!,DP*q)!ddBuy.dD"#=jʁSVr2Fv:or 4X?Uh_Uk3=/Fԋ=Oxc|:Ik?WDˌpZm n:ZҚ4jGYz-p; o؉i8|T|Ս~ϸ=u+wEՇ\~uZ_jpUT4Fh(ST e[p /9krf?qW?L ^h` ^(_n~YxoN.bc}WȳeawF7Ls}|S] ~EM^L p0jG~=Ĺoh\dT{8v,ÏȍwoR z^굒0\C>J +؆)K3HN/#ߒƫl9')wgh1i4Dґῼ)jK*kɐ+2\0j:N[Δc\v͟p`@P٩ AFic^9s9zSGL* މFB7'OӔ^S3a~vke[}1 (vv#cUo`FgZR5Q9'`%XI\&cb.FDc6!Zu)DVGP0\m :8U2}tІ1ڂU1q! ºi묭Td40Ϭ1&d?\+Mެ#& R4ZZ5"ťxh9 :@cT" Ut }^|HfD6)ƹF|v2db1Jq+`d҂}v>ЃMh"T~JeAT !G 2.1 w4խ#CBz#Y6\j 4h{/pɭcBpzw>#.? 
QApcG=sB3*PDo#|7ȼƓCX\]XnaˌvkgC&w7 23??[w@H@qkԾzwNԼA&M;#9h8Pbw%P&;u۩=v(JKU_SXZ0@KfW]S/_[^%hc:'(hR"f=qjV:)}DPLǫ5əsꖍh;H& Z&FJ7x7Q R& 'VHJB:F5^FM/Eʠm*Ӆ>^]"1{r|hyQ bࣿS}g E,u Hg~:ns»^SH0[;{z.&`Oτ;Eetrsr>podY@3k2{KayFvNFV[=[#Z8b45]2äwrylw'+$N0'Ku'kÑ ţc'1a]wNJ-6l n[C z: ҝciSrK.NӭI #H**i0/q9r+︞*lG(l'd1r8l02=66-M0nXɨ=;e2wɌ%*eԲ5S6|t;|a3ųy.T۬ 5I^k ?lxviR9P E o%I;O/,bbgBxdH"0U@MVqIN$xӮqDžCFk>^Yqe[|XȢr= r$Jk^>"l:y]uҥ,t4 ?RB3jUld!@* ym `6YMBQ8#uUVlU!WsQWZO]]!N]CuRҜB5QW\%E]j ]_JáSWP]iŜJ6P^)^sΡ&ei $EHXvbΗ2Qr2JH6)x,U'PRP'cJ O r0Z}YO4JYۻ"lVq oS[?>k9^pVf{~ٍ^~$__]ƮX[Y[Hm&+'y2q l6gLR%cQXp&E&i1,$ hܥ R`%QX:fmuJW9B BP #c4"R5KȩcssWѸn5uHDJXw "% ޓǷBz@1d98ՙxՐ[^JxwG[ ƒCuj!V yμ>}@j}Q[j m 6nݹq ^m?MTj41 \uUSI4t$@x-B,}u2zj-c/H瀳\R%AJS8˒6BL(93 YDװF;JZ_;gt":5j(1kŴ`lD{ `aِ>8u_==eLVI(;\%RlL8OHPZ !ں=euJ.ܙja6 _Ehuf(YJ{*7T#gǺ2o`<|h$/ =lOn3 Fӷ&_lnm3I9pnz >ݸktkIkfȝ+@h{܍d ۜ>͇.I;b6n kWIֵv>qpәFnwg?+h!mgLeI'd=sOf5kN:tۚqOpRk"z'xnSs\sS-6?mPR&] hgkPo|1 @)zk- juzK)S4<&,e\GUG!I&J朠CTֿhxv,nVg`2umb3e9>G g[R3qR$tFTķOP) 3LQY/+3nlcٮ%8B:\BV[4K}ɱFH]mSPE~X'i9n5>F>"dhtn9( ;^JN"e:d>>_={Xˢye1,[n,^*ٖĄwxIƬrgf+*PoF76w/L*&.F;GmYFJFΎQ8 %@pkŕo3֜SOov+6[ۧb^ZftΦ}I$ML"3DUtr+E9GCǽW̆-2|GP,oP1pZ@! X͘FH!)Q jP۝y'Asa(dܦJv7yº HMP|M@t& O&9z5h*6DIV%߱Q0jkEbrsCQ3GEIY ( <"s<ʂ*%mJJgHR e#!R5RGDS]nI~L7K`,Rڸ ^>j!m.ݍ0)_($)~͑=S ' f\#+{/Qx](be2׏A; t"rtNSwdpnߎGo%P0 (+:mqk4&`[]u ե}0Ja,fC/L1_[qa*AoM}$:99_/>*&"LwGnRuC:h[5Qt {l!a~Y]4tQO"ku\[Eyt x>koӳznAY}zfcP[K%׷t ioFf 0a%`,|8Nxf748[[%hsA:VI\lHju8r1ȯX c|a}uJ*@ozuzg@o?͓xOyz/O߿yv? LKN]7o77wMW-M\3_*W_ %koӄ% jrPz $2~=;/,z5UMH֯ YNUl˵h`veCnivݹD q= v.<_ߦܢQGIH+vV@FQ. 
Ô1a<@ LA5sC3ILP"x\V:?qJb?A7tY sX2N`3 .R!gI#I$+bbuD>xGv}n^ߵ]z0ʭf+NfkpU6K-{%}w}6c`,#a;^P|Xx>aHCVf:B(W4 گo#YFipL]νQ^8ٿשHS~h0Tp,F=Q52f%&4 '9CH2.0V ש*<,)vGEKy]1P))J1,,VJ\Ǫlg΢4yՖ;Zumgtr@wL`bWvo>S(N֢_)& zSNRoW12#S.A ƜXcYwK(Ndagq]Y;<½¥t_1,Mɯ ~Ɠ TȠI53l{$ѨN0+D{ACJGt`ZD"rRuk,ˁȅ4*C0 |@N@B20D=rAiLecǴ Z(QHLd$&F:łKsǍIj$f.]Gwc-2eN'kSt<^\'ܳ0ٛ"9ݘ($>a; Q gf+ϛg8ҁ] ;iU%㊖XJaQlv19vfbE-ŌsYKA |@bfRd[Ʌ&Y3<α[`RQsӉ] lҷ{Ԭ$8ΗrrV)ovr{XI@ ,\a Z;)7 R%lRN9ịUh4BkM̨3hQ9b0XJBSELYwsZ8%m>qwrtM z0Yޑf5Jq6Δ>:P!Y )dsK#3)>Y*䡦_{iYA32ibGSxYYy(S!ca@wsN"r [ ÌМκ6r@Y`&Ƴ₼zSQ"!ۆ1'i  `*b2^#J1%.LqUguD 0)zMt@`RGTRJmLj|,S.#Ɍ` `]fk iu)"[~\KXR;ĬW0hԆ>`gNf=b1V+[=,y$`*$t,9A6a|(/#P+0ɕoW`e.f f6br8S?ڗEˌpZn-jF[6#;BZzN i--ڧl`+D(R!J --%L%+h oNuwGi7Ygb;m1`XxLR(xADU%pwdh_L3<7hO 1Ǡ]L6x\|U {mⰭs>DYeo2$*惰gYe&10w[un}mkA45-=ee w5331f92S2SLՙ'iyycj!]w.nԕԮasn LNvv!O7^4Rxt6}vr~HCM }y 3:&f Tx.45BCxKe7)l H:spӸ-ԔN[q*2~7mjLV)LTl8m}mCCiʴ-TZجS'@'}]}-fǛ7]y{aIg?_vIDPRAYd:̥piA L:N1$XMP6gĜ=Q͇šT6RHzna˪ײ)$MK$udA~>LRi|܁c4 '%LT*yiV)e9`/xT&ʔճ<\B4U)՝AXk ň^:0VЭ;&JqTQ( Ȣ`;X+\$JH!]κ.$qL ن3qam88 <"2 fJ[: OއΔ-ȧSgja3IRǏ^Óf2Tѫp>}Q9f{!0_HǗpqg7-=f Y@zZ)lp +>̥pUIҀ/xP[ l0w8) '0o$Ϧ嗩"NJ- x]?~\Vu'͛_Tt 5}[ԙh0 4P<}o.G&HsegYbٍQX|^f]Maa@.\ǣ_5brfV\Ys%秓E(Six`,(U2xi23Ńde9G׺ZN$u1CCGB@6%EU3 N{E j{!z5eE =pH%] Y钃 S̭I9g)1+swiD@GGI"qi0/³u9-J-AƒEiC FtU"ܻ}Ѳ}vcg sD{Go=z}x=;7Mѯ]jb w MX5u[lZ+I998ȩc.XY#p&'$G9!iNH愤9!iNH֑bfM&sBҜ4'$ IsBҜ4'$ IsB0o9e'+8',e} :R9: n χ0ꥃ}W Xãnbߩ~E5GJ%ޖF\2kJP@+dQ@K?>]@'EMszw^{G>U1*o~nVE!5X(-y pNj|vs-Z`v9VkնXcea?ubP1M?if"M ;nJ=Rj ;쇲Q*[.y \ sH[]c2r[K+Ǹ.OGq`"6M}-(kurUU+b[յRHCRک4}W*AXM~]`nY^N9u'Fn ĩRW`vM'3Gn44څ`ī_v:P!m]ym5t:pkTi@aw]F ex=w0 0n}ؤ>xTN ۃJAw98Dz%;Pҹ ֑ErΧ|˔8`‰lh6SO&  =}p~0*MMmDݧE|S UוopOaS)7aSw/aSo9;0ߩ0?`oXzr ln`"nˎKܶ5;@qq@Xc{Q lVAj-lOd{!qɡshlu{ZB69cC;Ў9cC;Ў9cC;Ў¬^JvUcjJIː4P0vNID}k5jWF gۊB-q|rRX!t>cpq~9U~QKAHÜ-41f \s¸达uCz*j7+ EK2^u6T[xvVԣ)g JuϚ;kV<9 |,Ag<1wט>b:%al_h̒pH^P@dR৔@HfͮlB)vNeW㾇./k.lK`OAOu $EDiy"8(V#ZH:oweXs6 |[4qȀgzL8_Z{; ~)}ŬQ@q|1FL:;C8I "DdCe;v;Ô/[ufp[Tд-QwF]i> k!15WT4#ǖLd髒@hKQ#Ye ݇;)<ޢY7oq _4XP@jIPE"eLԴB*#@& .Khl=Gd%YZ{[ģf@܆`m]ş V"Yu 
}QϳW5WwFTRyn[&K^V3I|U>n~n&oJPa1dJ-6mM:<m+ϷvFTU.*tfM?Jhd0u0 4tLGs ncJ`aءGɠMŴƬwZ Qd0%I3."fzi-ŒТ ED6@mJ eC~n@r9Vp A(L H**m􊀔5`}c^'ægc BfV"b%*cIO Y"GooP.zU {CTE#HMd$dj"0|sPu\GҡH*s z193UkgΈ$ho CPAR䳗]atu2]r$jd-ҵNZΏ]4K1D_b P[;i* Y0m@…Vʃ#-.&1EXWGLčr,K9{P7곱Ĕ5rUj^./2ciVM2.#I!:Χ.P,=(`J*Avẖ+zG[V} Հnh˻iQXĂ-NN6x1Q#3%⤕ rI~'_Bc7ovfɷYw1vu3Rsz;8\Io'O@|8J2|ӯpC%7^V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+~]%DjxŴ1Iry=KL+OHiS ވ'5w |.&|/$~&=_c.>S:̾cӶnJz݇zIkw]}.n7l_\7߅.?Ҝv?}S^V$I|8n8n^e܌Г|wQv o_VaqOmZ0=-«,ݠ祮8._O\7GqYs,!o|~aqS׈K1URGt@\F#r\E2ל^S{Ιn_n?zzx,$l7i{OeX=ML ^MؖݯX-';W-VGƼ\^نMʋ)/r~u`Ww۳ kioKYmįCxrPiXw#FF_OE_ۙ³P~*o/x- XrƾKKeKyQmU<#8v+N(oggT'0:iQYg~AG+mS -E*Fo>`sNۘ|rOqz|o{9ߨwMഫi[ݾ<԰=c}Nѩ|0lzhZ_L~H'oAy?ۮ1wn6㣞u={+[jɾ0ט7tZnJgƯtf2,Vk]@jyOϫzLwI&&QLx)[>z_U_)3\4u!J\?Ĩ*!6lMAQ $]5.,P=Łt{962t/C'徽F][h|f׏|z}irQj%N:{íׄx`FBa_p׍q_7}ݸun׍q_7}ݸun׍q_7}ݸun׍q_7}ݸun׍q_7}ݸun׍q_7}ݸO݂͝02ث-6g-#6ƿ.{ܱ; jyQoNesiM"j kP>要(*MET&^z֣ u<0K8XOwT"[hZ%gYz[Mх(Yu JEL&  iAwG8wؽv?|+/yEYT߭Uf}w׸KdBZ/u}\w}xCܣ<˿3uo_Þ9{YzYp.j|?|MzFq/Yo}aSuUfF}rg!f1|FE/%vO_Hs6E/8oN Sâo}E,f7Y͢o}E,f7Y͢o}E,f7Y͢o}E,f7Y͢o}E,f7YZED߲^>PyRS(jBw&p?h;r"#]/}z+HMH|s(ՉsQ+4(JP] @Av#vwl;̜zkʛ34]\_}\͋)/)//V7S?OV 1{Sm=2 >z=f>k'yд2ݺ2v tx>'z3:țm< W_=m><qe_bW܇UvޫzJRIB%ѩgwq$qdstttz;:?.TWwHR;6*|J F 9P#q$)U$dyLzELYw63Lj̉;tJPZx NO#(=>j͎56KPnɮ'BTb&3dFa2#LfɌ0&3dFa2#LfɌ0&3dFa2#LfɌ0&3dFa2#LfɌ0&3dFa2#LfɌ0&3dFa2#L>J-N RBnBp}Pۅu}Q%`zx@7; A9C*n oSʨfRz%([ M8>PgxF1(f(f ztv f.z󥹙\lԽE;y<3̜ ĐRP<vBSn@J@DM,씃H됃;f^g}|&富3MQ#ٲ|*RI/n1eA:N"5Q.y!`jS }>8`-j]@UaZ/\؛^׮fnf2T~1H ߾H>q}N{G^=bɘL|X:<` $"F5:HmS3+ۿ=O@^.{ӿ@0iC(X;39뮿 w3xp.QQ!X/C/L1*Oهp[\J43rS^%^{ǛK%քR)aU|ri4ⳡ7W\RŀNUr|[StQ?܌c{}7r3T<_ϽA..빥z> " 7* f //1KWm!3чQ0iN>o=w՛_%hWljX%s3N'a#5ÑKMߌA~"y).=924*&ԍ//pҟ?~zwwv? L.Ƚ:@m+>=깙Vҷů7WuB$I\ϣ_ 3p h搜 46! 
u5!0 mBv l][^CguWub7E/\E`V𢍱F*=9{wQp A@R+ B}*Q0sg c=%ʆc{ҥC:膙!H;ި<%3(D2 KɎQ 6$ RIWN|8Ln$Vب/u6tFw hKWgmwY{;azsKSAzUaz&-(eݣa5 0_%h@Utn+kq3tB#WRPJPR%J)7x s@DAH"0% [A]]|af}êJ 5;9y:Y@ <UX-zg՘1P#^ˈiDk45[ʫ~1lṿ_8^q<< @"UK EPg"*FlFJIP*.+0V ש)EMκEŨJ4I[(f0ożNv1o}mnj{hMN" Vkͽ -%p<%Ս|Hfv@Ǡu,`EY1C\Jg\(üBS(N" lG RQ[M*Ffd j0ĒKκY2RDvFBޱ,Y>,(mrgT߬7oǮ]=ϛ}݅0F=LQk %IQ!`D$R:"N.l%gQ`Y|M:QY1sRw$0 r Q94 J0K3jًx;LVenYdU^.yI3[eKqYe,!]Zn5,:OhbeP{&:ò;[YjAJJ)lFRK6;˜}ǡpɎNEة NMKF Ml)'G$WJ]PKT\^|3)5A q4WJ"Ro9>tqz)TWP\ N:&c0uXvݥ:{[z筩<osHP1:8xNSo;R/ec>K@Fޘ]Onrx?Xp^rÜ%Q [Z4Y'˂mGKqqc`b[9cfF%@8C RhwS r 8M(rn#"( -J)~ &&c l!Ypb7 uHk^2'֘cˣ?}"m&-']$:.x>gglBM1aJ%o;7a3~8Z#"[ fRSJFi6`k>U/Ca" h\/.@\R"+6w1H^>ɥ/b2#&9[VuԨSl 0u@홟>n %kC TEìz^豘pz ><fAe|s|%T8 yP SDc nS"!%K9 Q ƑN\gYfxW Nq zLɠVV< =e5|lc9-3|y8|Yߙ3Wpff̘OǘKu:{bLL1w eGw7vbI[N}іڵܳ;Nvv!O7ongzH_!2GeX{֍n4=iyDʴ)R&))T(wQ̌ʈ"2QNʝa$&y8"ni}WT]ьۯgfI&wj6st.O3ۼgY}y-4kylڤØwcJ46[8;odz.nvڅJ+u! tҡ?{iuo EM1oKl< k_z2&4z4R%d@2:JTd֡zD^ ^`ps{43i8gV7Enm<Ϙx@cbZ;fa˪. ~CS(N_^wj/iֺ}N/h@T}~`A" rQ DֳZ*`?hi?g1V1$ CKqK qDc93]ΘVrƐ$7(d;lԓ&{h> 騧WV >ؤe݄u~ >:H4X if? 
mdN7KpIT&0SA<*dTޒ^vDU Ø1KhJ !Q+ &nRy2,w)Efn2'sNz6Aek5K t|}9o{s}?zHcږtk񄨏gn {, xz rZ98{=%nPW]}3"\0nǗ?Ei]5[9cl~DZwnW5_l܆Gټ畖x4~e}k~4{:,Y.u+O kҜ?U8ĝ}\:K[?n OG!WR֣P̩(T|zyzWAˁbe%5l%D*Y"K)dF;ccgl쌍'jlJ[L`"Y>`g&<T:q&mR{ ЇmxtMLNI9oVO:Dqn Jdpmjkd\Md ͷ;-D{Q{SjWv$qfgik9Go{ͻpy%p x:]hLD<8VbAE)lٹv UΚA*pRTVA<}-`*B\TpD0h;Lm` 颫V), !RږiTq%HH<3WeX$:]2ggm]qn߾'=ɿN i/׳bBf_?7UwAGzBy2_7) ޿7?y ҀUKYҩsGqT匓>D G)@1>l`c2QJ[=v D8򐸊.rNgTL$[T Ykäenl5rRoczq9>^u:Ѿ SZ5^KBJK+PFTHr5>g.p;h=c' J ]HÖwsu=1`Szf6E[41L:stPG2hC"%T)J*/AV6!Je2B$ā ERN(H3=$xN%hmF΁ԋu~*{)_MS(NJ:CT9<}X|!vR['ZzW.<9x%Q"OLD`\!$Ypb-dT8'Ҍ0GAsA|eӅ J~:/LSF0FXa l,'2[oZù!@OZ'H~_ X/qM\E^f=dĔCo&a!y +bƢ$WKNs&VFCeO[5H B牕,2 I3i1k} Y1tdMHw)m|i6P]>ԚġaNT߁ysa{!vlQX 93p"bS,sFz8F$3J K?_̤ϳM.;)"OX< eؿ&%θ%eL]s2/l2%!xqWͿ[]>2#s4hL.`k~|[o*ᴐϹ}1;{C\[ޥlhf_WK/٥;!%0%,Ԋќcţh %W{k4kk3#Z~mo|w5}7jm F ?̃fn[튡i\.&w×ض`H6tm}mX#&jE3Xfz>ѓtuTv:mn}V#RLʉy hm4 ɯ+F͑Wr:wtiM?lnb-ǿ7ox7߽珿~ora߾Wo߼v L1ﭭ${|5i?nzj-~Rw}KM}$QNB9rFᇩ!i]4U^umB5!4 xڴ.fqMeܿp\/R-B,amviՐjVu:}bO$EIŬ=X#A F̲̣gB&"W蒊  -tԺa.]}8%1,YR'?{9w+VeogAaŒ8\p ^tED/1C h%&j<_Rh0f@i.RZU]qqML]D/?C#E@*TZ:a6F~qXRZKζN[VgOTV{n_timS*7d#3<2:,1M%H4t&h%bhB"N)4Z yV=3bYie$Q%(bSj5 L&uY8s̈%vka4L+HV]nYjNjw Q6>8J)Q!1E-Pب aqd%J,6lT_I㱎nd1m!T p \)Fd5jn$K" X-7uq19IsbQPsK/juP<ջ|V:gGrVL#8+YISדNjѰzQ-x-qy4Vܦ\retBHy-u^K]j蘝QRd,)$gʁ1%gS(b!Mլ0lWvcR) {9c[N>;m.b]1Ý.IԘHiDsQ;T-RUlN\;G ޢS[r@f!TS|99x#x<MŽ'_]] f0fX%7rYz|dӷe26FbSy;sAgM峟k?ysi*t'zRiX<"K˼] 3:ֹR;w)EMf(+Fj>75r*g/_gz] B_#ŀtoΚr6Z$/ ;Ea-fn\/z?7OŽ;&5k,|x8i {jr12M`a}~֣ػ]/f㿞h#ԙ&qZ}IN\ eVI)C5+S#S}SSObՕJ"Zq˲*Q|`*QnbHѐKyji¬@ܮ,Ӷ Rd6:*Z9?U-<{ Sp S &u䨳9 h P[Y꧷j1%0Y(&`.#2 `RdGѲM5rI&G'T^Ѫ6cQͽ4ŏ> kds n_5xp?.GHFTi&G06<1hvԂyQi|[|뮯k˙|Bٻ6dW."8ȱq ؗ5J(PS=M")M2]]UUuuU"ɓQW\!NE]%j5:vuT$[u ̺RW@`dU"tUVcWW@eYUWL]QI1#'`NG]%r)9uURV]BuŤ??~ud+A6UV]BuSBW`q:Dzt$kTW"0hZzxZ}`0ן~o%sEhw("p\@.ΰZfr94JL#3cO1"VxM*43JtD)D"^PMvBl/ˢ)`|0VmYV%:so4NaJ0okJ c+E0Qx)(Y+Q^zvEGAg&!;S/19۽i2lQ/Zql▪4<03mC<{0/a^_79-2͘A6) ,-$D!x6HH^( 's#Eo1{:5`r>Zovn&x9IcQ0=rbrX+I46}1r6Ce-tLj_ig9pp@|_\3՛` }}$J4h֦eno7N&` yl2a(SҚ 
&ӎa[س{Ker7lQ:Kl_N}^mougP)Ҧq(2NSOaM5k-F"{Q$Nhܧz_,ټ=0}Fk[o;yQ lyq:ˌWY$'U7ݢxt_8}|c9]M+7G)7{KO-, |&s N%7s2 т)32POrIse[\!`y8\[ ,XPp7ajcrK.Vӭ2P`Uq$%p˿WT`Fxw`c{ Ev-^^;4\l7`8~)ElY63I);3fZl 턷1j6QyE{~@v\Xthi*m։&I+M"[̪εaې+Gw'P>H=JJ-E\ G) ̤ *CkX8Enc~Z7nT힏7ĄsW״:xva1c l:z]rXnTUE Z͠ RH0bA@ z0I?n9ɰ%$cʠL)3㵊N)+ۄ64P4JJőym"X p Ta&AXc ňTvB:0V@W -h3dk|yn=7{~t:ʜ:M`-ࣅ*?LTabʴ'Hf:)U椗:Vne /[xyR73f)cH 3bI̽31s5gfL#QHYu2L- n&y豉hj4BZ"彭2+ ~8\ߌ37ox(jjΛoᆨI$Dq_)l3ru o`8.,!VNH%"ג6KeCńTu6i`4v cwnzRIakzo#xiq:7YL"="0XcD$/ ը'&|էx4(IIɟW٨uBI$ޚ}4L1"M`R,܍\8 9G^X;kHE:gS4a.E*Vљ]'8Wʟu mEN!} A w30ngL}˫uB סCC~CqqY6Lr;}Ok&H+=!}UgrudPslP [m an-:\UiRP)XlR"S2:7Lc`bҙV:F T/S2颫F_p IPCQ`j+$S1f4j|sWLIԠh-gr @\.=\tc}8`-m$\.% A2 ǨU1k^Jy> `^jF5#gp) 'ESIt][kr%\owark.'G2,B6iD?[i{`b?X$D+ && r+abm9$0% Ff< L*#R"%1dNF0QiK`@ dbs,QXy,*!Ouj^a݆G\,Qd漒* z5jFD .:brr۲A?ﻖ۟kkF|0nQ"Dd@T(0\tC xi qE#}aAzJ>n0R ddKʅLh})C4>y-`j|܄2r#}AI} Uy\BᎽB$\)&Uw_S(0 I: ̋( C>4."`ο'=tTO SyG^EESgɘL; B"r>DWRP;7ynm<==Ϫ@0! H =n539%Äte _]&. q"ci:;? c:}w;CS(r`fT*+4;^~aM(E2]}QkBr aSg֝If&O_g+|c{u=[gs;(V|7.K^tr{!T7# CvyfCP(4i'{D1񺗓vT֏Zdݨ*.q ͸kLtϵ~zq>'ܿ8O+MS3s.J%y:v  B0"DDTYlI!$i+yCRWE.(q-E9PĘ$S{+%RaZclp5*nF˕<׶vUa C=i%nR+o>e HR,\y옡N.y ANuQ( E).RL@.OK%ٻ6dW}]`Kuu<8MXX`/ղbTHʶr^tFđD#ۉKJ`)D4Jɞ7r6#Zʳ`aoP,4=cQZy=\߾e,G_ЄhqtIOR-+,)Du5Y`.h)hHaexLҪ¦F"Bg[cIT3bFfĎqb\ jEmq@A㽬d3DWBa3M lv fM$SVJ\miARtF[ȬJ`+:i5DBE%dőjŐLj!􌇽[~O boPDt=#q@^9)Z` h ZU*DVJɥT`%wև׾] vB(6x1i Q["k'~W<W)Q~\;a`J!(pUUpU8WW 7dh,n,s|Z)iwۡ/"nXvi &#OfYyE)ͧѼXf洬TZ&5D8{L&& `1)<ö]Qm<\ѝZD{&ZV wlJj燝cg ]6lHBi D}SO~c-vvqyd)FUR hQhs*NWaۨ=̠AdYhO ʄNk cbWr]E&vr~&6c> bQȃJWZWʫf\}5pU+Z< bQڃJCJc7*F pJ*pJwWL-;\U*-m'++aT:brCJcvJ6\}p2&֭W`޿&WK#v'W[ue\W-%+߅y+Ng/Rt*"U!-a^{F$Qzo, D]H47oS՚)ӋmNL o~!ŔW:@RLd>C Pv(E`B|4zbe /qffMZc5Aח|6 .J1K|<6̌-&$).='F%. 
ӳU.\o pݥ_$g]jmƧΜ~8-n(D(+0J̐'|Wg=U+ՕUdYX*Fǜ5`j9lPTԭ>J'+X{(}UpJ7Fqn[QRcv#}dì{X˅)X+ $:bB3\&(K4Wm[`L8eYۗJ'ګ!̤2{˒г9[\X(ݗ8WW~]U}yys!Z弣=ަhrc[!;+)Adf6QQHÚ%YH p^y0ۘW\(} sWxh|* mf]EԘHZK딅du<鬬$B]g-T{A}eӘt2 zMXamRrpV6A: qkO_; (Aq`gai2pS=LA)e,JwŌB;*1$%HU#k.!T ;А j 2T/d%*UM6!Dl϶'/ <ǒ&g͈'|Ÿ;3jڃ{{Y9ffB_c3$A ZH8+6Y3Vtdk2KȊ#ՈRQ,dC{#g3Fs|}A㡈zFD;  w`XϭFN 6&ZVlA3ujr)T:X F]ݨ/r6#b9⮧FmK,k1q#ź@jG}ٜN[ڪvFuAuE*g{j}*qt6vSWiFSJ5O ̪ÙEUDGRAy D\ U$ԁ$ѷ7r6Kz d^eVw ۮ,g,UPϫJ9W\^H |Iqq0Jh0 z_O}\pi4З/`>|^\N\/=zz|(f/C™Bk4_Mo7Wi7'_xG^{4BYWia~&gV>ܴ4-wI穊v9߽en0{üanDֽ Vog6&^Ԣ҇^5MKjm !+5|\f)d:q{kr[ཧf H*.ie)4wub [빂fX=!Qݳa9ǜJ BxbN}˖_-7_{, o3cJ.8r||7N]X!j}}R,T>^xQot EMo {7`Oa}=~{oKipB HgR+/Fq&YɁ[ᴧjBV4Чٷ# n[697Fͽ7Uؗ9v ]vuB}UwTڈJ>RP"bTբ+^{,Z{?f'AF#x}::eZȃwrRh$`N}Y6 %U)@06i !@@9Ku*µzFc]fۥyBYvMIYraҝݴwjB >B1TY_+W9] &N KHZ:oe/{xye4Joo>>y0^3J >:F܉#E0;gDCƺMatJ;*jXfڰ|YɟO6פ!I|̶ŁtZ?`=INĭyQUgev*||&yk-ݛص!NIE:d ڟ_O[d1bahwU{:j5`3ܭx|w' г(W5l&3qCJt+WNܷ]5{}88~;S+%[BTz4pM=yQ$:Z)l_`!axJaGaRVN>z>z]IvP'kI(?7i!$񷉏DJY M5NbI#GN@@@<"2V1M*O֒D ]0?W$^c+\HخO ҵի1=l>K|"0w.} fy\Y!d`&\JߣǨp[솞4Z|3S~ OUP+l4\YF\J{Hu!H9ʮc:%OoY y܅GuWSe$2H<5Ig2ĒyNG1T]tUlTj#USXâp&@٦SbDcPzu$jոڷפ3lӉQ1dkAzm[Ey=ԫ8= |Jx (0Q\E/S6i Odx 8=`1)xϣg"0θ",-WﮊwW8(DUP쌻*Π"-ൻ"ҽAw%u]uwUt]ei5GW@p Cٻ*芻v])Wkn]I!@U쌻*]iݕ,TлtWJ.Emq "ULݕNuNRw]qs2U_*RJٻ[tWj=-,O?F9Hh)?1' H4yRH:PZ<4>Ij6>RnKI(J)r#K)y1&*[[ov 3> J櫙N?5 ֿ~z =FpxeEUbV_,Xlk/j=~nWz"0*6qb\RO^_?!¨萻ls LIhN 7_SJS$87_;M4W~98cV;\`%TI J'D(4AKh9ؐ}G8G(ܨqzpǐ[yϢ!i*2N3.Ҙ *GsNdXv2y=eʂ>V4~C. s{/LO3ߕZ*Ƌ#7$i{hkgjKpc1E杄byjng`]+žVgWǨpZ ٘|<ρ?DM=(xfAQ*[}5YՍ~nТf5CCz(=.{$.d%TÉn"4nlލm˕_lkpoq:Es9o 2qcYO,B@fn _R&Sێ 'i rj|@kp63BM {^Nfݩ WH{ :&H*τƲ(w*dd-  c~.$1e=Nv?fݽ '7yNa0흙bWZ'Qx&IODx]bcM<:ix>ݑi}{:rN gэvt[ٯ<;:4`"0z_1;T}vEd}4*XԱ>c#n8]i^h 0~ץ0+(:eZ΃y9 Hb}]ݭՉ{;UU{d_S5JR`lNE B@9KuVImChR(׸Җh餍IT3>qtǵTڑE5qGC5}MG*8BGrn0GIw6: Z/= G2`;YƓϛ/|qHy7KЧVFDE$T@ʨ=< Hy<;2;C,o"E-|`@Lg(( ;ȊBKY["c%Θ(<*[2Ī&1 |YlvJeMo^dύ'Ǘz=d6}Vc.USYQ)|_2&HWDAK (KqqBS锧LD\HAȓ {)KXLBHS$c0a MH]*N^ z$jޤ\}B P+ `}.}@| |;[O#_ [gVCA(t6?{WFd /3 LIyE۳6fmwx<"%Z)eFV6ĢOۇXʨ̈2##2xLy& yMwC &x'y5a? T`viimW\iE $l]w.!) 
PHBQW&"rlԒĀ\F"Ü˄-g`%ǫA}js{W ^9@AӌK )hz Epҡ{9;c/χ7#:oc'~V[7ֿn5i+zv7ƿ6b!YtLbڲe0.!Si  Ri=B1` ضIH/D B,^0L t*R!"ĒbS*EWsYIk pZ)Cb*!"Bc|b*ɭ5=qV}>~NzIym}<ʛ5b&ݶv[HPRiš&՝?3XӜra#+'>d>8#k1\'͓D$ VI A¿͹ 1'6h*cpO@HPZ)Bىb zAƽ(#PHV'Jgq~돇bz4I솷BnWIFQIit}nI\.s!Or=oyf^5L[=wj\>w]5E3d+ftb2ۜ=|:]Tw7|,ٵM2}M͓77;7p0)tv_2÷cm!V^eO+ $ wx.IӑxREJ:lj%ee }dqx.8/X~繄gKĢwdxL.*OK!Rqb"zHeFܖ Q>?AuAMnLW`3oi#qW|sשfkEcQFؗ:ŭ18)B!f e%8cqĜpY};=bV!8I>_Z9@ŶjVr=,t\2hBe#XGK(Gg|P,F1#6yDAЫÖ# '763s *V-alޘ8;왞?H>{f=ΒtCSkM>^m m}폞_'k|g܅{'9p@:-JDB"1DEg-Rè}2c+jCH/:~0#}+umh L+`Z#B 'Bk4-( j'!]T(T3CFiJz..#!8L̈L Gl"WRmR@(?VjjŐ"8Ƙ֡)1Qn )\`HH!|8#rqQ;ǂ viؐSV-9-I\*ͅS)tX#OreRVY:??cⒿG !5h_w{Zja}D6;2 "$zΟd E)$:~vez\{Jh71C׉ljY!yk3$#j )(!aGWL+)Gn4K߻= ^Akӿ"jV SX)֋K3q-vH7Nj1f-~YRq;:})W/sJG2%ٗӟk;/=2 ΛcGa׽9{!|ƹΡT9B,Nu{^f_+o ?^'Zgk}S{v^-]X(BU뿻^s{)kIԷ$.a(S̢|@C> {0bG6ӲUF6:}ɺVkUs;w9S KT!?z+u1=AÇ F79bēVn I9a D(ZDg$ac#!IP \v 'JCS>͈v̞.J1HMJS`)P`BGPNSrH1q{Ě}}w!ڴ/i}ɓ6ܳ;Q4YlF>ԛ"PG Seܤ i*.#i&>v^a0yKv>R~‹ϗ/l^/'ou®Gr0O`#.Zy7z{+*~~2vXi?IY);avQJGr+qig$,x'UՋL2 ?ȳ@w.ɭqXV3*^[ɳn4, kH\TI}Qީ~2sh LjzSazUzSt޻zh^㨃ع\g^ _zNj??|Kα'Mg'xW>9,M!751B^aSWL moU j驐>/D|KH΅Z*$xpͫկ=Лs_ _&B|퇃r w=>)w,"$?~d*ל*"sq*!k\:%7 oDUe&%ZpT&o ,ZGp Xq6y>LGfLU 魵nkqw`ǹ0^;?l|ik=yEH a =?oY2 o˱ IDdBNB.,MQŖBO4[2*AFEf|@[i9SeJ\d " QΕk6!9f#59QEt 6М38Qϸ(dD"GpJF *dhJt◿91`}V,\Lc=*mfd0v2..>^Պ8^J캳>vq7Ϫ|Rg0.F˚(Ľ&Rp}!\+b*r?b`BA v;XHo'uV i1X>Ln( Dh,@(2O!gv4xksc!TG`瀸uGX_ϡ3fGҋ sZ~ggeLQm wWEpӯ ϲ˝Yț +0~:@^ox~-jjp՚Wm im8Z㵟_ -HRpۅ!/pop/k]^eiT1,_EGKҚJTK(r8'ZERss T?~'JyFtyDN 22\K7ל5I) bֲ’}2hq_opp&6fp6D5b[K+># a-:[]$8wo5Ce}GQ|m6!3.pX{Q9r/U]9 t[6vqC偷?^LgzH_)襻]RއcV{ƎcyJS$MRmk7xYJrdWe5I1J[D|ɷsYQլ0@=OIqg!3D%)~د&`t|YOŋڗ슖8|ABHgeT $[߭Oqq L=.qUvb۠ ޒ_:HS"\mcp8kb юpJ1NG:$ ~VtW+n*Qn1 IG|pE4-f?I]$悈^(E.myx7íl6k%.WL:V̮ׯ_M:d5WD3SR@ۦ!/Z?p8-r{ow0/sTmvyj:pVkK<8zת-JlFqV bPihTir)=7J Sm=RE.BtܞWw31%^y]ik,+[e֮ts(V~J@üA b]r.9%'R:910x$cm 3Efb^Bބm}oI = sUgڼK"H0n= xw$Dت9oH>D)zi"Ii0/a\eD^r@Zpli%% ,$ $w]C"qg(TQNG@?cbcR;>Z]ެsL(QmGsD Ȣ`;X+\$JK$`6s_֝hY =$Ht\Ykq(s4|~ƚ|= ch V} \em <a!M)%# JZSr1b{koanx1Ruh[P&iQiw͛WU̬G%ZYK0X`"k"QK)២+&9NxQ>$8?4[LWmgܓ 
CMៃr!NY~4͑!RDeP5ޖrT:4JL;{ ˞O*8m[ZF#*.tqit1(=)ej=%[w ˈӘ\RzHf-{vsz-9SʵhV+> OwD{qQ6ehQI <Batk aƃl {1:  @9Q*.m(zl@!jGcHXD < %PzIYm*]LK;ΪinumiM-&գ<8`$u~* 2tψ> CE=<""H;;G 3߉6^R@ȳ7䥙0)p>_ޮ|~3;bn]p~z~{zq߮Z/_T=BR+ .剬F3]!Ҁ/xPyÃ_}6JU ֗gד<_Zozx7j86I㩥|\śQj/+?n^%ih}[ygR:`R ch8yIuѦi.E^@bŗPgIN Hpӂâ5Lr%eck~i02Xs Cxξ$}!&g>̘Y3ZzA^Qv=YBi?Hu޷T;e%0ՓTk 񢫬Cz 8$*8ʝ7\ LŘ8P=!DY^1Tk%iκ%ZKqOqf9Ty럇ieM7gًkm;wOPO!%RԿτbbʴ'H:]*U餗:1twI]Ńx̘N;8.2$EHkΜycL#h!x0и0{-#DMFS.I n j3p5Vj|@r[l}YyҮד+wpdY,6.[LRuwz8Vu`n ]Ʈ1RʑI@h'59n_e]:nάkgֹ]v^y^8|[B07׾i<ϠBNWس|KMnc…Q_ͭ4E7W-az+}T E]5Ϣdk=MLr7PJ:?W9Y<?oK8qUY=SuH$ޚ}4VuArxty2aS $ JZSr1bTO nNl Ϳ-Ku囚J/oY :ѥ\lI<[kV=v_N/\ĭH9-V`':iq _+_JLubh׈`{i/o3$)5C'+=`>5Ut2Wi1X렑ևśaԕ.N$mg@W3%ƺC.f0?7v8P3k Vy,UAĕdÈd};H~Sa2ϙ5k3*6ˉf3nW|]2@=PՐu8Ca<8´]Ҷh`+U fP)/v&B% |h+bv"> DWo 9tYElnt.UilHA듺GW[ցn{gz_"Bɸ$ JB󠤉oo`HSnڿ[8--S]z(1=̷֚/Sz$QQr8Lq:AG2Qh)0qZ5(K)µDX/5^#2ndΎ'3:(gszʅ+3'=WjB%\z;p/^ug~Ͳ\\Uˁ5)ctȄX9"E\ C/! H/:<- t[ב HitL*"Q* J3K`i^2xAt["( 0h̪A0N N=:`b('U$0TXQ{-0"ZpQU"؎'Q]]CV`%ID DU  Eid/!θhdG23M]:!# \R.$eB{  e}j3:||HS>bhg;.;|´?Z$K5ҎH~B0$Icl.y)J0~:N{0[.j({u({Œ1.LҎ); t"r8BRP;4I8ayVÀP& -brtb\/T0!=.~LTY&f/UOJ43_S_%磣G%քRvau4ֿjhZ#lhtPR02tQO4vՋӳ"8[*| {gR=ۅ" iE fg u--1u͐Eby `X(4Iv'6{IU6r]k*I+ZI s ]b+J>ŰLvMA ueAp㋓^H?>W^~|GLǓ_O>}0lm䭻ǭ:PG0a5_'4jEsJ jU)P|M\xfctdsQ>?N(z]Ys#7+ =F8#:a<;Yc~v EIJM! Ev"X×?e%Ӊ!T' 2*92CNC̫֒UUEFm +4ioYrAГТU&fD)ڜ5s$+#c5q#n*XXmf Ue,T- WrgUrK.OL;2qo  >GlX)e%t&4c41zz<}A=3u2uC|Rؔڠ!ɤ1)32bWg?bA?.)fKQ[WFmݢv`wxse >Ӣ/>Q+d&ZtYGTG'M6UHC&fȁhP$,j"x2bV!)x_'J<_sAj㥈h+#iEĝ%\ JdLbVI&sҁWe)s ,^@r3 nH6\F" M,$472Yj FzWHeD&~D\;/,lf qQʸZ\lq!qGGX˳]"պ{(xM c~n,'%>W9`ncn@Y蠅kDg[[4fgTࠔd%JL,r`L8N(bU[N5T&vfB|ړmf#7jQ_O~4?UP!Yj 6<EԱU![U!ϵm50FiH3Oɸ8Z .giLkB J}c셋`bq] 84 $d$lQ1=x`)jksg?Dsg-ǻ8/\2) ;5> )h*W@@W/ESC7D'jTnAq5RЀwlMsD`ZV;°Kώ-8zI\^xu DJhsqlmb+[oZu,z~ Ƈ ;S3veoL;BLǮ;oS'OPoֿ>x7%P=m#y,/|R^ox,Ziq«6lSrum+1-% <._{ߧ8]@td'=agh3 W_Mޕ&~+g.]t;~9L`g.Z7lt/?xBW98LʥjĒi]L$Ld)M!|dyܲ8 _qÔ_]}7#'taBKB٧hggV*'v3E|7wXꝖ{yzi3O~80_)z?쏦=)%P̼ gwwh6v Qn?ۀ xnaPA7LY$Xm&HROɩt pc[~&|8 N*ב%aJ:9$Kp=aivT! 
A&)hxk-ٌj׵4{!b:GgS+GHA KD̖}qK0uɁnX-rFHRyΧȲRd\b5BE(0\;z8g~#q4%h{ҹQSSǤ|8lSϝ M|s yI{;e`Ol052@n` &hP`U2}^7Ĭ<0Db(H N ihg*:ZʷW 0G:f5԰) O\sBΊ,DzCِY 8GH@?E&Tػ9us|vԽ3j|81%Fy@alD24DP"}*JOM(QEcJRgb?s-8WLEy,m{l2. "T괎BD(W͈fxkD}}qχgĂ6o-y%#$?FaD.N{Ҟ#C<=Q`u)u*fE/_^ъGW-㼣~;ź6KWZz<Ά A{nr Nh-cv"\0't@RT QXcoF g>^"̮p7Vy%F0~ :Ϫ:#aݸÃtV$GCb "e6\ ̷H oJp8]_uSA}3q1F0eW L d}*GstERJX h+lk)_̌?_n؝|j] ϻemݧƦṾFwB > ν *Vbդ%&}@DJ^ BpD(sy݌C 'eb69 &֎h1$CL$1 <պRBK?"fl)R\'ː>|(| vPݷz"}p3 <-$ - $^d'G >r̡-p|[ =-5I :g d0dglBNpPFa3`F8y O{eXl!hp{eUodQS2 `y)#i?(Hd)JϢ^vW[sZp&~XWŖ5TQ82Ogښ]7R / x[h7ia\9̊yeUA$gyt݄X٪5K2&,Gs.in] `|PX.98BY6*-c.;28ffm~ޓg3 vH40wfў{9ޛ'ݰ0[jYE*iNCpGZދ|{nFmo4&v4E!=eǃc-=w%:RZ~R~\5~籢|km{\jb6GD1d;d6 `D Yde3q%-hyb8:sM5+*P|sHnLO F I"WF#Bf]rZk&{W)8ZЫzebЌ@4Nn۝nnK(, <_|1Yf׼z oո~;6Z;BqpUW6+B #?_>eo԰??]5OtӠ59m׮j{o}ksO`w׻ܦ7Y7! s5/1-bCsZ飫_.4&~-_ũ)'%44p: )d܃LWIi 6 ъR3"ZQ\@g{!Cc=p~lf0^L֍"L`71 j,Zhj2HJBL$2G7}ߍZ6H!ZƘpTb%$\$H|X5[֑iHZ2F$#QQ(. M>)%¡"FڧqyKM#i1!6E1g"nK=҄P^;msjal.o68 >1aD$9M^|0>8zm2S=%{1M ֻC&(HakH9:Us⥻9/Ⅹ&kIYGmNGl$F"c׸R j&n9'.q$g)Z燁 1l?;u?ěޕdJزoMW#K5sPpo9}C  k\+p Ó[U9~-ity9 8|1WY=×+^*qLyyWg8O{ߜ_ߝ|ۿ=y\u-0>{Y-/Gp};jug_*~M?|{>f>'!PT:dVGv8sBI}oB?9rHAID.ŠNN= ]hLl;U0lDZ~5ݞ*5ݩtkZWu6LcZ vl£/3Q2Pڕ#_;Vs:n;X󤕳LM/~΢{lg[uXj3G?.:2:F<]J 4&H2i:fJ@!$R0Q#1*qR"i? ʇ۞R3.@)E/L ~|01@m;dw*\ :,ŒAOhZrD' t,R -MQHeKp%H-?%NBOkB2N AR-lS02g=2R mPBa3'`3jssxQ_y Gl>P@hܨ5m@&b)hyH%.5O͐ T \*/I=Mb`"|LE.Ynsr_PeaԖjwoh.(Xt '}RxAM6ɠBH,ecl$!N ȐQ3,kB$^ȴ8lI ITGema<,&z(& 1܀8MSvf9s7#!hJT8 $( X8P7eTΞU|(Y"vC\M ^gQ%.2RM.DLzC{u<ؔ(' 3{dJow{sSjtl,2 fc8c;*3|&sHP*oi$ 듄 g\.ROr: *%d. 
F#Tb }2J8 B0T22[AjB$QKrX#0A\[#zA@WO!Wl59^Q PYF!˫|rHӜT x2{gh +yq2sށ3HN;kd4Jc,o5ELhR /I<)}JRL3xq&|gv~uW圢ac,k(#Ûux[c Xex]hQY4ABv*/Դ0Ȍƨs/%TB@}Psi$L:qGQ Txic\2 I VH ҇/YNp|>N۫cIM׶s=;t7O.*0t3 #3}u$7IeU1@Ð{ ~N\xU]y'Ύ_'ls#r?ga|)}ZVک 4R"yQK& Gr)= Υf_u:^ȫ袭A-Gg#gih7Sz0Az~j.՟~SlO\9O vn9όQX@O&;A/!ٻn3p4nÙWawmT^OאpnO"&#~՗`X\IP&s*9w Ym p/!X҃,HOmd-},uX=0ROmZ9ڳ"r}(pj*K)EW_!\1WY`WY\!;\e)5tp7  \@;Uwdk+09 !(.Pr(pe{WYJ\}p% \?>r('("59,:2z0[YZ`WYJё3\g r߿ޟx28t^;(?;vMwcwijĨheiQQ"2Р6-<˵w@B 1mi1jgF8rF85wdi˗& [iaA~9b) <1Tfy`NEY(W T mssW)5ȝ "锨ăCh _s0{S;ԺZndݩe}ieE|Z/>f7`WjJ)bY`u8g(YñEiٮ =g);|M?>+ddRʩ8B R 7ޕ57r#鿢ؗy(5yx֎gc"6 nI_")jIQvjė<˳mABU>lNl+G$aG^$pp VʺA̤* KUYM7O}ga 0!:9)NmЕYMgNJ$bB8zR/yrzm%u~nz~ u:8I;-k|6w]z}o'4 sQI;s6ԺKfzsɖNl9l; \޽MUϏw>WVvyfz(EE[IVsM'2%J+VI70Z懤oЏRcH6Z翿(X8 0BnZ3:kkZ Fi+&b EKA|V 0ȂN$3iQq%3@m_O'ډAښtqt(kZ;H"/Y`c*# \zS>MqY (3U(B՝3ÞEE7Rɷf3e^3ݠ,']IP㹻~SLwDX,h@⇮ U~3ZJ#l&͹B)sV't֗K *++qDp>0s5c?yw%>VS$y46bn@YXp1:h$@T_+I 6y12 =S%p 6#evBIFˀ22\܍zbjs˺ZN&}az15] R':MJɹT_ ޯ*GU&E;9}mr*z]escPQKOL#9tsZ qe^-MdugY^k',efيbWtMno~{q^ fWV5g%L u_~W!{l2(PL2BZA9Ҍ|Χd\{bˣ]Wo:^J߇.wb ^$G"WD? N& 4#>D4wq $h>x(x"P6GSj$Q2G0V&Irē'FHZ=,&\~l 4J] aϬ+̺ d%o&a&+D&HfNs&\eX.Fz? 
O>bO:va$L^JzLڔ> X1x4ދc?G*3RFkwJ6H>Ћ4.%m2$eFvH}&5CÜ2X߁s:5l侓[?~3R~_ZƸl~CZ6Wl"r)q"Iryn#}iTarcw3}mrY1Icθt3m kwҨ-awq1ᅿηdKY4ﯕD}hp'dS.NhEsL=,m~⚎4~feD̯+k5/fF5soٚHppu=[vPD6~F_{$PGFtm0 1"m+h.v4㘃tUͣ.5jZt"gRn4҂GX"_ٕWr̕|Chg FqpwMw?ۏ?|?w;.컷}SOKPN̋M@LૡCfGJjA|[׋?}?¿ &v~{{sW+r(g3Q^WhD{7,˃Ewѵ!jmbCޅ65Ŝ vT]rǸ:.0nVX.6#m=8:F%np9EIŬ=X#푠#tfY3!1ePEB[|kv^QnqXHe/dupsf:ioӷ^젷WQ.j3DJ-6)I\fEô96;U.v&izB {~M؎73$Ƚy/=*w/=Ytq_Zz0 D [jDWS f+>:QeGvRNԖ/N@C!>`0$ q>C{ͥ~K+jgK$c`~S_^OKLR$I&/YAd%<_2h0f@i.2jf(W\dVsM1ьJZ \fceF6팚miѤ?vfjt *yJ+bx+~M*1s"!t"u9HTy/ʘb20)%*5)!`1K.HX f6g&8WsaʒtKzX,FBUY^TTdQiapbϓ8O~g ?~~d >%H4% h%bhB"N)z-9tɓL],+@P=ѬElJmph2v̺,HK3ˌ\YbWvnGbԮFJm]Yj^jv;^ƕ>ӗ*-W:Ѡģ-غ`Ґ r -:dH`bQ$lSg%BINNYm:a/*x.}+mehzKčV%` Jl+J %";,%`!uYΘ6@qR pD.r&6tdc,`7b ǬtKď׉^u 8iuNjT\\t\FB`%#BGe1[́D 'cTAia!Ԣ}!+h>{ ȁy2M7S~͙[7YbNpjNP͔*MOD2h0.FJN7cnJk1 {@~%Loފ8 n-pzOP^}8zˇmN`V^ʛ6_~Vʤؓn,ӶeMIvhT_E`k{snyT])F)Yk r)Hpyld P[Ut@:"I!Ǣ;4F e@I>37+HtH&o(!􋜙?YXXèE2Won│#1#u:%orT! jÑ9Hs!IJjV3o杣o|5m^&[> UYA+|7'o,,zyAlhy(u55ƫ-)8TMРd$+8UpK+s*c* PnHmX`D1EVxƢʌ{Pe@F#J.P6בx NyQXǛ|$J";46kDT y&9wBb Xmd 0uXJq!Q:eB.PQwژE CMfjJsD{v.i› Qtw5~u%`}jB#:XŻY~v|zůrhCXcM Omtx=9PfŃ~'JXpZon|+-3 Qk602, |h3N "y+H#gٕq'X洴xZ? ᇘ?Al&'tBkى T%JJ|0 !5t^rr>s :|>ѺRmlRaU$;yIMXxt8mI֚Ӕx,ݙ(i!{zsa,-g)ݴNIqR&he6\?h%w<VoɎGM^i hbN#3J Y0e07՜x:϶9:2zA >g\\ )hoLώM_Aʍtt ]YpM)k;9XHADn2L9%dz䩂<0*zNGqq^+A#ǵ_!%m,ʨ311r@'p!i|G1N~W[L}XשEǚ=  Ie7Će d%Tn=/$|J7҆,"\DEЍ*M$سI빵6)fl`|XѪU98DkG@flW]#]5 "2 " #&YOFwl6stwvmgmV Z~S3l[m#nP(yU}q%hotdݩKWHsjC~o&ΩBfXd{}vYJ#'4mgi2r%%lOٹ~{皕6*P9}p<~&RRi!P2$* @MV N$x^,ͿܝMAZBѢrA#VgU\oP*{W ?s쪙|U <'sN8JhMxgyQ\4Vג~TFR#D* " kV; ^HaJaFS]QD&kFrKsm&8&"cSסY-2 'ES 5{"jř>ɻp9F]\L".K"WEJzzp|8*.h3탙G?+7.3.#Ԃb6r  m0Z Ȩҏ,ƌriu?` _Vj_u7?afZl (8o;$Oˣ'! 
gRxS{%&muh_, ~3\UE6\,F>{, o.as0 ÀP|4d/X6|rܡ-~Dn#qF$ޜULi$Y<3{V;Z9 E}QIl#ei"T@@ҊsږrO|Ѱ@l7++٦Vr+oN08=F oTECf;B)TDF٘DN#OC >s.I5JJ-E\ ;tЂSH&]PH '&zD.3ڸMh+WZ4;ą/57(JUjk#fpr}QT "a?\ܾv$ǖ)r4ύ*:G}/zTW/FRq$g^(ֳ<B4U kVZ1Kͥ c0 ݪo@Q% YO}G{~t' 6?G1 br 1Ѹ\jv~ qQUX?7Y&EvXОT) Gt8▱+20~9+Fn{Cga {>fFgї\S;Ƙc|6Ǐk+*thHV ג)E 0oThY^?yGB=e5LH䬚ʒ2eS.0{VذU0fPRL$ɜYsʝ(FLt 5i\d3k侚}/堹qCƅ1_yw/G諣7J8%7/m, [a|lPr*rQG@cU3,;&#*٨>ԧ .s&p@#!7\LZ/|П3 z ">Yh kjBZ(aTH4GeYp DRa:=gR4HeqcXEĖPd !% 3Bs@T9Cg'P%?ӅŴ5粟n>]ϞU]^eETQU\0Ss^ kvjhғ @Ts} yS&(J1 AA:T0mQr8V}ٮ_tfv4EvԠ|^8pDix5NY'(b5 A:Rb@)0qZFܣD R5cYw6WrV,kԓaKmtKnnrKէW7Wݚ,˥6e\ &ł )W&aEh@^Z! <[14F o„\ŗDVޥ6DJNcRɜRaP ,%1#AdfcEP " v̺WBF!X&7` y%U0j^ \"euŤۓ: }ɇˆ[mz$Y/U! *#( ]hLxi qE#}cF}8ВR ddKʅLh})C4>Y}-`jpn!}I7wg]ߊ_ZKC>V:;Ù]ƾ61INעyVmf!YI}O %f(ƅ/[uоl. V:UPdB#}t̎e e0BDL J1uǦ,Å8. /Xͻ(ؤdhL32rZUBQtA `U.I V3WI`enXJCD5nRpbдFf簅ٕҘY]|װf`I>DluH\úxP E7I;7Q$uH5JTwPY %GBZ^tN19ڄ˰y7w$ Fhv 4'ϿLsKIGx)CIG$%js43.b#[C hZea b.CD2LyDsNz&ѕc$ G.To/#Ϫugoet&h&DTAiaF x\G+ln5l:!@ -_,ΗmVkc7,c2ₑ\3`a:E2Kbrk +ISCUaˡ,T`T6x\;OMNxscP'jTn ȳ>=M&}yWzA٦XZvEoY>e)0]ݲzm6*S7*7,o˖je>x?\.J/[ݻ~ׅoP0,ȊĸYTOy1ʈH&%71 O<}j2\M;fyV⏘(`4)k@?Oڕ%=1/ߦܢ }Ԡ}Sb^5pqv+nyathTkm:ֆvYos1-uumu㥼inJY$n,ӶeMIKm72 z"[6zڊYmjZ NȒ0$KpƏ蹤Y:@QFX+,4ѰL?F!H?%ҏMQ$f@"~<.[1pX"]m关ɸX7Z*Ņʣt>E ]3d P0;k-f!&smjMn1Q=to$-kNm^{D9ݡR͚J?A]#g ,rh&'ӞVxShxȹ\?`vUCݶ$cI,k+Yھyusa~w?!$hMN>1 eNGV:a })*cpm!Ǜ䙋KᣵFh{iL.U 3y7ny];vH8]ȶ:&9d 5hbp0_j 엵ht蘩cUw&;"lJ'S&D9Yʑ,]pgY\tL{VXh{W.7yKm/M,8T 7I :g dJQ7cSwB2 C4*1(mm,⿲wakoXֿYJ/Cqd1 -VAgW[qj1F~H8P4hfɸubz7f1rrRRΝ~\Lk\( WmҔY/~?=YTT仉g_#=&O/ҥl'c/?m?c/h:/&/ % _'!:{FH6arݔe|AuWgO G'edײl:Iφ H^A&=9Qq>qd&Ɠqo - oӶe4ldTF1-N~Y>cMz|en81Q^n^( '1ON:-/cN-{OOK^0&q<O+EN~ik bpvq1)XB)ɱn\S 5+y%Mb8:|s1keH&o;Th"v8K_JJI|. ʪG㬨I0]Z~9舔$t,Y&%!>:zV|Qttǟnk/+X?٪ם^i-ofu;ρbBqiN3 Y_%F lBލm/EZ/◛x8~J@`̈́Dy+Y<+%ùr8'ܣ3t gi/EB!%dX]+٦?*Up,pjjz4R%dH2 2ӢBd7.{%@x*W$f_dF:FP;Oo֧l<$fE^xXW2JD>Ⱕ~*8l_adD}ykEà Xkj|t6飾U+YSͥ~41V1#D : "kNZ:x-]6DaV)rnTפrB773I^̵-[&؄8c>cIbs|(Rԧ%P|*N[41Հʪl$Lb9t΃2fO=bRGHo$?R\-ZL'S^<Rˀ* x.Rf\xLa"l+幂CUt@P %. 
=8gZxȑ_i Cb8Y-Ds~ClӶt؊Ez|U$*%Ihoɹg UeX;;rT龀+J<'ؒs4n*ϳ}Y?zri.kaJ2 AbSA H! 0MBN*&Y:8ǣEdxŒyȎ1c˜g pC`tyg*+HFa@f|5U" C~e@oG斥bL*c" }aQ9Fb26PԊg}I{e+k$ 9,e #Gf>+%5C zQpձj2oYO#]ܴOuA3da(ܴ+lHFATB,]쵭^Fv>MmU*|}+X HaeRliwC9+d7x,A2E߫G=X6'$+kX# k[Cߊeţo6y&^ Y.*$ʣh-V[dzkEГ $@2@ڵ/ۗg(yA$prsrвhp>[2sǨr!'⊥Pwܦ)c.8S  !d %ur/S˪AeX[;B8рA*Ŝ+Jh K*f{(0KO A6ׅ/@k>р0RAGϋ8drɀN5Q4.T!/08f#-Or"xYka6Y؀9Нm^zqsl6̄,5aAHQsb(Hf3sX4 yn>u\pg-"AL''ߠ@9zIn :o(, "4>%A= dC4;FPמ =l5FA)Fyx<7:^NƃM jN'M.g?s͖6x;4xܻc7(w7}evlzM77:kWwJXs?j)Aݠ1m;"HgѼEM~خ7mxo|us" ˖.xK{˫/~'[ʬ?zOsGU"֞zK i%_޺kj3 9Uq\ښc[Xe[ɴ fFeFʣ5F)/"cv*F?BNR\\ ?hD!(KF!,`Vh#gҡYE2j.N($(UʒK%LJ$p2qD2sqպ#4@l'}0Dɗ!anqYL~^Y`1ΨXC' M`m-3q 14 5)Vk^v(z2@5n>]_)v=W%+d!tF)r [Bny"|Qr+= 6,p?/?q p&EcsY 䄶8&E2Z&.',KlBƪ(5J+=HnƔcG2ZӐP(Zwjl`  6xMHD4I_ QJ A}wU4 'Ӧ`vҴ9$дywqRV?5m٬izdo?5B| O']ˏ5Gน S4w/4k-ݍy]ɻwͻ{[2ZN9qѴ 0aM·{OaB/75_pJֹ~m'}Kgýi~HpHw~@:ct[ SQX`!\'iqY^k LKr^x9:l֌38~if_A2D(Q[TuS=3a~烹.7S%Woqp08ja޿Vq75 6}XvWJ0=Z !ΑhR{_u;1&Xg2TBlN`vFAvqm[h|L&A“)1k2w.wǥWKAϥhesLAЌEK’\~Xo^z~'-Y-9&˞uՠzsܯ%b7Db$UE2jT+F_Gp~v2.`[L ^߈Ie_ӕq)噧:RPA<& t9wu<L*3+quO5 D'5x"Z%)4'=9kHkqZ|MBUzS K-Ebb s;aKez8 !kHU/ :FpP6-#t3h.NqRFE M^Hu݉N*KW[V+WZ?ռx]{]D.VJ UjzgɰgVBE [ojstB[ޱ.Wa_ iWn&Wxf\ NQ44h3Ʋՠay}*EF" <2y *Ab s(1 ̈$ =tqq[iZwY"ޜͿO/G^3!58Icd1uD/Ms4ȝ~>7 ;xŻs 5Iz=p䆇3]X([bXwk\[ 8O0uŕ fgTd:faZ©ncV;FA1~10I9 M\VE۠o ovJ\LQ.HogA2jR:"hkc0RyD:3*Ktʳ{g}D=z0(q|DeOz#*%76ّlf^/Ho1l(oCPO0Jgax8nf00b2.i>-l 6QKsYiWE\Y=F2NgiB_+_ٗdB[ 52c{f)tlyMou~Cg={_&,sp˫\Q&h$#?[3iw>nYPHXb8hJ"q+-:p1j@_vd>Mxz!}DNVR^ħ"WyDOY܎=Ϸ/')ulJ6e 3kϴ0/dB)@O]To5bȡ5,t Zg<)㢏[Nhj4M4ƽʼ0 Cߠw 8z 1̃fO|G~*QCdSq)z| Mpm9֭CK އ7h *l}$>>=QZtr4>εJ1݆䠥䝓\KOYJ_c |NjS%>sF2"ׂT^(tsAnƲ#'Ip9bɔgm+w7 wBzxvd~af#lkS8h}jJ7nt*UhU#kgCGYY C4mpO*@" @Jl$X!eLdA=D!y1HDeZDgB?Ne kQs+p ]srsIRCu(gMVochO~{k{v=y#Ko'Vso,_,$d9ؠ1 O<;KRPٓ@nI΁Pd-³X"7a w-mI  7T<wH^@vs$9P8 ߯zHI#i"5SSUUuuS {IbAiuJY.J9RT(u;Hؙ x͆9t2w5"y]TinC;to7PgZۏxC4hb(H$ 6(Au9ΈG>nAᓧA Q) %A x,TFsFm,0!Rɣ"jS˭@1A+sBuhAfwC  P4e7q Ćy@@xi%kt {Ru2 [SQ8=z;O)OTk\@M6avE7ŰڟRhjq]|IW YY'y-m,`q|2;O&1Uv$JH 
v<,NN\8A2fbVr2Gl_ui/eiOq9~s<\Oi;ip_F-"z5_|9,^l2ogqo7]'P~]HP$T:ʞG~-'qE,w?"XvâoNq۲J`.;e\mq'b.%<T ~wH)-|4޷fP_uϻIڔɳz؃-a|E[^rlVoz{Ҟɻuv˚>3I83n2 ov^%t ocMLt )\sQݰ3g͍or =/ԍ\?stt6:,m_ @y4Pœ .A&'VsGMĒ4ڠe nz4Т_yd#k {Ԝ;H"rOVh˫Bh"h@2Eg9(A^@N:./O~M:kt#}}+խOƣSi ^WϠ^'wU {ZQƏwd'a>l~aV5/uc5A*[v^ Qͳ:JaXYB&yc>bw__-FHRBNEDA ja*B a@ImPj<%qXM(?$\E_ӰtOl"[9nHo@L|LrU["vEEJ=#lp9[8t" 5mR$K#)PHیڟ 3f1c{>Lj=e5;S ܣ`6'12dwRe1y( Mޚ(lTJRpl0QH# '3 CULZmi 10S|`+BFG/qriV}W׺ϧV]ﲴ=DH,LyD+u @w(*IapîHg V 14 QʙDiBЄ`k$ިd0r* .sʃ앰PōP PjApn=%,8"`MI.AXCX=B ~m9}5` {24 QDaZ-Do#K4&t^@1c v\Þ0Í>6C`~{|szpܵ]cZQ_2мRe))d*\owmכK/GEǙMRZs F:H ZP܀O4"v:&\Xa= D;Q4@*ShC8KEj'4{{ EvuC] C8LRܸƽf< #Ckvf<)>9A{e<68d!ȘhdAY^c I%Nrv2 M 稷yIJ 3^ҜPN%B³Aq- FΎr?>1?è9/֞x I61ɈrѢxqoHV(9o$% Dك )>z2+%$z$\aQ:+% pGj rOOv^J2 pt8I#YS蔀1 J$6 !JJpnpP(;Ob v4m#Z]@)B)jyxӴE 2pa20縠|Aچ mX4 }Xg>ne+-T"f}vc9ёQ D: ģNcՌp{n\\Ϋɦ(]ohv'/'oSսސ:ܒG͍$jndɻRY;v&tkDKZ[!,SW!53:"H)[l3ez|82on[=_5DZaQUןݞxn+^v[KE6֞Ҋ5ڜtçvU:퐽l~j۞n긇qiw=I_vtw=_oE5kFjC#nh+,d>C(g;USu!re,-:.oom|?jzMlN`oz֑XVH-|̀KzTZ DUtϘ.E#.&΍ٹ7a00T U6!q&8%C#Ub2 -cé_#jPeI(:ѩLA]s7KGqj\ɤ%GIfQeO՝M8Wpl6'7;=ɔ"bnF$ 9 ?z/K6k˭g~}㝤7L/>zHlOY3(BGCjPyh@#KFRʜq8T,,0byV-:6T$ӭ=Q6پ5]P_nni1%GtD4=r ͧ(B=]Y(MɃ:2WԹOy.)V.LJp*v]{oG*D; ce8dldqKBR~pgH8"M$Z;pbK3CvMwuތy+EҊp5M 'e K t xǖWƆms0Zn3]bd|VO{:-*W}c(-TӹcJU4cmuRzQ6jjm6 !=vB{;N|󷠭⺺[_e cUEICvDS~ .d0X4Mr-賛|7khabP,o5x]wj'{ȖyA7OTmЦЦkdp,T r69J{H@S pv4zeUަwmJegr#@\)*g=ވ]%i!ܯ қO*@=@(9ԫ.li6Wa~@TΫH`NH _Q }Br T '}(mѸJ8H4ƇmxL&Jm.qk#)6p!q]<H"ڃN Iq`Qgv#HD]\.e+mb ,V Ll-'T8yH[\0 ]1=^} 5iw~kG(W!%(#*VW$37Q;tR;\D&&IGxv:Tl#r|!juF%L]RQ~oi˲}C糂tW%Dl"J!ȂBIuq)h&( !hAfԫ˥{0oQ2G0V&IY֔`Xi# -:ͽ>w?6tqT(G}HͺWWĔCo-1Yc%9N+lpY#Eгqr*8O[IЪgYHIYK`bY`@'㽈>fostH h-qd@eFvLSk9eRkLF4$VK2ƥؚ&oQK[ ~'XJG~HxWZVDg0]f45\ggL/m$/o#O2}%5w_pzC&}2nݟ[k5CY9{1 Ӥ1EFgvr9s߹KRŝ{]Omfos&guQ7P0iyLKse}㛫‹Upi<_Q>=kV]1b?^ꠊg~s=F@mm>ì2kڈ GZFf40DO^ gQ vl]t"gZ,#e`ExW_M1) S 9F_ xˆ /ψ?^ŷ/+?_7߽~Å}/޼~ :a,QK$I(>C[~ekNa\ zU}oFI\UMp-|l!ʉ.LEۄjBh"-ts6c\+渄k-0a@,L3]Ouh$` O,32 0;\K*2L`D4B;BzRp)!6ًC:[dIђy$nEcCI琒THH`e]4L1hsį3 ,5[Ұ=bi~wqn~eņuXՇtu*r⿏TQ⭵N5{v tzKK; (CM'kETJYSqVG(ҵA` 
97*ԉK =o4c?3>9D@%G}t&jlk廷^:.X_L.n/23Bƀ1Ȥl,帍t,*Z%]QJ)JqM Q\XL.1ٜL\eKv,;#gdt& ;CeX^ީ,\RTmvUqKf]qbkigo,ϣg >%H4t&h%bhB"N)4Z sBJ+34e'*!ER!hHFd2v̺,HK3ˌHXbwFnƣb;ںc{#82L!qRcUg%;iY,W}$)ȣ.vJUti&`9%5 Y#XLVQ")hXvFnyX+E0D?chzK[8X!K\Y.H@ hS,X㮩rƴҍQ.ke2 r3Ԩhc,ɛ`7b NW얈ΐ:\VuJθ@(XrrZCF&,:&|LL8 J ut@/B.+8Tnᡎ;aHP?~>#7qn/TYFQ;G?>P#pIyeӍxTć 9<`+nS@YYYp2:h$@ }k蘝QRK4 Y 3{`vBPF:DIk602`X>u\8-<$@$wzO*pǸ3r y<8 #O/'s"nvC;r }EL5ϼbӌI Fr̀y,͑5XRѹT^iI$V8 }`Lڔ4:εs{-:.!(lFύBE-=mY>ٽG',@/}}}B]h4v,HTX > ˬR HxWΓz)0R:3S_|߫۷W_ -7d|H^=l{@Hfΐڢ t]So^R1{7$uR ϛ4L(:G?*2OLB-#lv #p/6/Dx?ҿ&/\$E\wOKX~] xS šiR8U]zPmN|=^^oswNb۳# z -[yR<[vl~V;h8>3zF ^jH tгO ^H4,I-x{XB̏rշϿRɨC 2W\UTAUvZ|*#S˙@aI_iCXeMG6#[)0ldیf]3[f̜kkKt) {bJm y -QOlyBsD(Ѿ ArM~z0--4a\s܈ƭ灧q;h˭Ⱥn~ F/P6#V&x@KVrIQ/^ݲW¿It4!PܷAmqә/~OAp\Zd#cg`Xp^IbP%OWN0IYijv6DER\0(%&PƇDLR:LaIxޗmU)p.ܭB)!,.VjijtN2r_\\ϯrV* 6Y1Ι$B"+I)4rXAO 2<,@.~ɮau$ E*8SM$6EI9UuGLµWUe_τl4X0 Yu2}#cJ8_S i={<ÙW὾X,y<[8Q0cm7HZiL JHx&1V 2 C{Lƹ Qeȵ:[6(kv _0Wjsx!} :CўyȤ}?IHf.$.Wpd#<"Kq1*L1DJIPAx-+*#rHT5Gc$6*c$y ~ :b_ ݘp 6kizzƎNǭ"rԺdJ)mp<`lB:kf8C 1 yXg9UTGoW Kn{d`^F| DcCƶBXe#j cck@Ql|`aZS!MG283iGu$%k<6֕xCA[$l0p~Lcбm5K丈^<`" P ˕3VH40[A J <<(ZGD⦪jI#sHx"fY{yɨnP=9_.JoL tSȆոUT&+7/4R2AV%TtgԌQ!] Q 0+X5jώⲭJ FGE]% IK&Q$"/=kO(@H*+9JOQq@X*+T햡 /\5$Vc+:c"& LgA:ouKkW,3f̪(N1ƀ/CZ B u@ea]\n PA&'D="jYSϻ uئ:H^x1Hu@Ky[БJ'.  -*=``j>pLqΠ&acR }}5@i@2QӲB Q`rQR0ac|Oyq `}Q@VǃV( 2`G[.cHp~(Ӭ(al2JRpac0vnd;qP'SaA[w5`QDD_b =Nx@6DP1@>P9B(51 hwp!UBGm` @;k1-PxW3K:vU 9GŌ:hX%Y֨48>p@Zi298ksj#:& gh ` eDAo 2Zdpa,p0",0wϢbt",Uj 4]v =, f,yՠ6H*gUnKo"Zz7*-%I@ |^3&J X[ҹdJac-Anu4i~ҕ IU/OfӲʹ1F,ԵEW@7]pd3 L[ vM>;5,Fhky.j 灣d`9q=)o'=1<#}QaA]wCP^o* 9QpyВ.? Vr)p A(P`AHP#F' \ amܨΡ[66I#+dOU.D)F^,wj%o;C4k6g}ep9.sW= SFVshd2@pf%]?\ze,$+m:ekZ$ b|][s2b q8x}[AY,t2lMb'ӓ1JY87×86ãp~ >CoFux&\fo;;/9CKzt$GI^Ikggr>O~ɯ%/a ;/]?dwkKi_:W? 
H!9 ЮjwS.ΓTBJ-=;>;\A?'d1.7^ЦybpƸ-W BV j_lڀ>b"8iq4I.֍A9iwu?`T$vq*AvQ H΄+^g[O[S}-#3=lLqI7kFI}OXYd(YɶoUUql7QG?lE9_ٜ[j @N<]m`wp[0|.م &j;F]k9R j3 HI*9/&ŧ=e<3c ^Ŭ45RgmhMM[3y(Ii!X`Vb"-[5x7/b @(قo7=R󛽖hY#FѼo+ȿx3o0I>^`ѣoRy Z>yQHꗯ9 ?MgfaSM=7ĹZ} \y\B8gbnk;K12W/Љ,I'2%CmM)L>䕻 E<_.{ գWȠg;~w?{՟AB76@K+QigS}C=쳇3,|I~So|>uη93JԮmMiM皥TU3H ˑQ'|esLffcrwOg;xrs([/}- OM_nɯƒm'7f^xuIYV YTy;&޹ T BRFx|4%J _.1MJsc>jv)vnaF)f"1F 9v^QLcgf`CmUN@Pm1n܁+vf`"|-ev7ē ؑ3+*&;DPjoTsݎo')ΔxIQTǿΦ0NyYw3BX2NbuUf|6͛#rKRrKA-s@ܥ# V_i6ie~]p-gvl˩\)3m,>/ٰxIID1o7R_@,&Huh].o:D!q̫}O$hA8kof;mY/>"MlWofK#-|nwb\3" YE-ss|^tQkZͻٲqq[OSpKMv NQ]ohyE\޾yp#2/:ߛIj)g3 3:;(jȕ/{\tê)>3IhJ70옠{Y0mGz!S<;np6CjĉG7KцW߀yA5g. pD͹㊋$"TjQIOV'& $ Xtv2tQq:?QmM}"ugHMHt'jܬ=fq):ąƗPݪ|S1v"[!Eo9J9n1rҲ'mUAf(#}{>Pr遀}i!t~>Ep"-ݱjru0:(:x41m qB1;HC"3-Ίv&V%; >٩ʀt4H< -)4xIs]2@O4gZv*gr8Sh#5B;4aI0LϩQdr91$8rw,g@9k?@;;5M:uCy4@ML2@(D8c'*J'IQ(]~~ag̯JDI5( <::BXЊx &$EQiwZA-?w 5NH:% d":#M-BIIGщ8x6G16"MWh-:nVȨ] ?݊62]u" Xq+A"JE-ϡAH>Ph D&(5 EѩJ$Rpd\`nFE1z"\0 uh.jzgmPɨ`K8:vvu+ɳ_/}pjԖ-mbGE]4_$\_jl>wخɊ8ˠ#J(`cݻ嫩E5M4RAUNE'sUޏ !#׀ *5sH! 1`i"tJTăP%FqDGh^ڞ&}yº^}iϗkӶ`P"ΡlOߘp]iN] mih2AKG fx%Qȡݗ,8tiU}IN,>٤-XgJz&3j$n!F%38Cm ]\)K$EDI!9,ҟwO9]h7u\b6DM|`+lyM%ƈp$ K+6n JJw%_@~x뫁TXYYK(-hG[h/0Pe&<&4GgwmZ_mN@ۛwM l|jlɑdb=m$L$DPsg0oE"r$6")XSC>}+^oVlGW;?lXᠷ;Ω\Am}ؽ[ 3N8wâYˀ @ϐn=d4oFފ9:C(yT'fx3hV6`3~k//!3iu]On;k5`&jKrŹ`2znwd2 ͵#Wvmx7׺~}  xٯN蜒T) g+9s}i!nn\l:ڠ /u nK7۫wT(zhbK6ٺۆN»+^&{ԼRr| [_ݾݱ>s~ RK>ꎊ YԜw^tIe>$р|eK#|ylcCn>>Li{6ͩTw1)\]4Zب-)"a `*Mh1u/^\A^jK8%aQ{B.pIȘ+dH 7QRxVD{qL0#z(n`Z$ qj|-L%ʬbN0p5@s{qŚw zw,g Fz/y=_k}Gk~eJG$K,zhLN擹'H4LQDfsfDB3NLDI,kTUQhH$H>L,E9hH)ڈS) ΒԅzŚۈkv.{sL>:xЎ(upn|*l6nLc6B/n#Ů(.'tB Nq+} .YFY F(99(jgh\0 d rƮcc|P,F1#6y+iL9Ҙ7gA5w㱬qѹ5w e "EYa>L$]I'm L#YNc6L* 'Bkt  =ZQQ48+JX:i a B4ڀWVQ¬o XNGb(1ƤMSO,FG="s,HaC:$yWwrJ'Rs4F\BLjb>'>8]z8jlt;}69WL9}bm$ez7N+o$A&"~0x -x:=w\I/[͎n'e7F\FZqNyȘ;6Q"ׄw@ˆ)\67ҩՇd̝\^62 C*uV׬%,G׊ l ڙ#Zf{y{ m~^|Up1\8_tu>[ncA)Gbp/`Tu'}'!766tfu@C0>*{򯇃w/ٿOuNλ2Nvݵۥ-~Hprcy'>? FTpN6?^9NӋo=ߜ_8gg|vopB>+&pܹ~nIm >U_~A|OV|~t3>d{ql9νK7gr*DR0nBb6 .~wkZt0 P=7nId:ꎓQMd$rP."  
I48xdEl@MLμ'p;ia20x<;NtHlvEC *2B(@haQ6hOM0)^g NN {.5'Ȟ; ݁$8S ,̡ow> e<밪(KB 42gk"@Գ&YF-KQcGqy:eI.""h 2h (R:n$F*y_3Et#Xgp(V81OǓOfns oh . N=!hc+"+J8[s.eI0`EFU6A DF0#&Q*aLɅ=v~8+!xbP {mUvE[ɗi/>R c7%4 $ \RQGΥ@S\I*7 %| E}MfB98$`P ڹXsv6dx,8#QWX=* a.6dr+DZ+ M@3#CtAPBڤ4Q Mj&|xSh4#tFKoG8^gYr_d_/Vuw_*o< Li2> +xt)9ã^fyKŴSR-_|~hCZ=kO]{[\~|Gj(d|9zgo${[ @&FH#ljIAVJ}F S!%3^'+|%3%M=uDZuV@iֵsBl|Zf/uH~"c TA _D CGڠ,N ĩL~iZ0FS2yU`)U: gf85c2(LV$ߡkn ވ>;ΚYGk_{[@9FqE,)dV]zONz+ILxdY/htF fa Se:/#YS!C0S0rJc 4^-nSTwO<ߓ|߸}u+bC,@8_

GN{h5 W6*VѪ ZDhU U%*Dp 5hU UA*Vj*)])W%ݫq?OMpJIW JhH9pV!aZb`B`7 0Bp8NM9L]Fg'[/k8Y~HQ:~EsSO J>K,'R L+y "$y;}ɳN(כ2O19H "л5 ޞà7rym|+B6)Im wo]~l~ :\r7Knv2eu30/yŨp- N>\aŶ"GBZ bZ{(?mnF$)8e73d?uA`keW}%lq$*Fo㝡O$j֛ZcȎ} @;5l5yFs@B!YpMou-I6F8o\'|d*&ph[> Y2dC`C9¹̛qFm)XrJDW6u3_xӪzπE7*8KEۅdh-KRpcyI<@9@,p\['jU/yHGAgQBp9xaL8GjtOOv~DNvjb()TJ؜/&%Ah8 &q% =VG}yMˬ/#୪E,B\Ja%%i"*E`2xMEh1l]NbAFMAḨ[fy6(VD(EGù!ZʞZE VQ#ϡA,zs`c}*g<;La^ʊ3ؤ4~(J $r< l4&@ lgY.%U^(vܫ/86r)+"4V#(QI.F\\EmC)@H1q&QB#wfS+~+ʥߝWXDp"$-SvJ"a6dP@RLy/֖rP6Pv;,A0(-(THJcQ'!)h"x&9'c Px-TP+.X-oiΡyK*Dl#ңjف-6wTZ3#[6|G߬шDH< r,'γp<xD&a"O;g4Y+zIHSRDϩQy1irl .(SOɓLwDgĒz% y_& CGSyTs4Ìf׷WE3}ͬ2oREm[v}wa^L%Ԇ 4Z8}bt BK8-3!hkA,'T3SJM!$D@uɈBlujMF{ [%ͻ׃JFD:'V3vQ7wo4.dmcx1^cwÛ@m@E8!&{94A'7-(%G?PT]LSī3ALk[Q-y aqt4ȟu'j4P~.}9ɓ?~o?mkBq~ͬȆ}G׽ ;~BN}_~N3ve&ck$R($06M1,ΧTǟ,]l'\VyG:deyb3۟Ó ?4*:HH(zF*0C GXL⿢㺓Y8d4 KN/W>O'=b}]Ƿ& CAu~=A]D2?2'~ua%!X,NbuQ7oK)r`KʔF ˑ2oi+~IJzF9jݿ[scS&OH7oj&aL͍"k=;-l̞fp2ujX :@&q ŏNĄY\[jyx|3ޛhX`fNU6Ay5|OR* L/3" E)ssE#Mїc!gmɻ0_r?Э3cGv%hlJfOjz6\xD+4xh_u&?=t?G;a]O~25]v8ȄqRuJ Aef7<5rrcu(Z ZcOΪ̈́P?ϨgiU=*tbUj1$,0GAV7%BI?hyOuXHZ#߇UӪLrb=e3cyܷ(X$XQ`>G-2^KIV;X7rʝ1`1|b1fcƪz! /rEGŋي7RǺخu_I_oK:zj 3mh>qkڔA?7̨Wkm {fwO:'L?s6 w<%4r.yv5 Դa^N3lcަ,s"_ÆfN%opD{fV[v7+YC1,Y+=0 [@K:\d1 O XR. =[yZYXz~a-V~-!f׍T9&O<2BIOxM4`|*Z Zz}:-xQk wN~q/L".{Ye!HqPʉ5>OpHD4N# n! 
KvOZH9(i ҒkKJ(m0:y:d|I4/ `[5/j.'PJ "Sؤ.GB 4VaHlפyQB$3͵ .L{6 Wɷc~`zsqO1 P~ƶFPgBKdUr%W1FTB`E`@5Y?wn;pvS=p_{;SzQxJ:R"HqQ'=FIB 'Icc0F$]j'ˁxv *Ôq܀@RQ*RР9 -֦I M@QKY`#BД4mQ/#/t[홤BlYVF~W4J`Eu:+@R(J_]v(p(ojWE5MtLׯVwFE5q\@@Mm;QeI%@gL:"c*^N5El:K:A B@q3'Τxd'MI"zE@/A>wx]cyX8P+z3mDbӗuwueqAbQE@ũ T -k'o|) ,|8}.jI5 %kVƍ"IRQ!z0Ic<39Uv4Uח7AHyށ)VBS9 J,TES= Mڱ]uie% PRj"QSXC#w&@8S Rpc)ujqO@qgk'}byjo<Ӵ3n?7}l*o7|z$6{de?i~gຘSr0UU4ʫVEṎ́ǰ X6YPmA8S2ĀvQ'&`LRgLzΨaHti<(Ebh18o RHF%`cxΊEpF -oo2Z2WT`G"]>x(8l_WzA~x]mO8Z<:U(sp %+Z~?ՁWW1i(n&n;ia$=w'Ksdodhrڰg3Bu| n^6Ϛ@5KdxHB>W hMF\ff;J-ۧmfP9Cnl֧yA._vs]gW^:ss 7;;;y4ȻH-z-n^tKYyלUva]I/OsY6ƱMC[)3m4oL A}%Mu}iR`G+؋[t;YD\\+`#{I!5 s&!c*'#XQȜ3JAD~IN0Z(=0"ALj8<(ÁDVHu mm"㹸Ί ?y"Xydlj_p] =}Yh񇈾\=`gS&\ι˙鈰{rdDa'% Av5݇udtstt?fJ.7e_7H/'m~ݴBgqeLU4xxpTy +iu2Nsjeَe bmc6)YoiFMUR1{nh O5DV4(bR2FNaC*T P\HASJ4Or';/Fừp-L9} *w3ȓly&fr1tY) Wt ]/g.Md7z+eBwNWmai2rgy1Q9{=\~=dSh+#:d][몸Yq⏳1o,8jn1L }N{f Ë.*O\?h|gOx:ٿ~盳WߞQNvz+| $W"4L,[-iͭ=~nV]ϯz*>,L9j ^}K}H~zOa@B.b v-qk,F 舉 y'Ra$8'' ' m@eAPiOE4Ax(əԐָ'u>03.U=z8IY4tu`u$A`وj%  9 4^i]A{ʺN':8>"XQ>qk80;’Ae]yE۲b}-.@)ExvV5r,E;9~߷J9;Z7N.HMhEx:P$Jy!﫵4"D!%_N-D1BrdD(qD*$X1$EiE2vV햱;ҙ-gk EǶP[Dlrq˦ܣ񗱟\~sBUB'Ob9Sf 4MWZb$*#LA[wsխ'ǐ=R1βRy NBPз#$^2_&X-vgnp\cڝ}jb_<@эN'p( 'TTh *p.[Y-+" wZP9^Q x"^p9XģSΊN r XǾQwlU"n^4C-u1pDY F 7`1BӔ743Y1t 2"opjT,e;Ph2Jx͛U2#눖Ί"~x=zɞv)vōB::#I/t3O81Mjr0ýNH*(v!Ѭt;[{B |O)D?>R<@h.ϧi.~Saz!U 44T)$0Ȃ.#Enk' N%.I" Z" (3ᔐ Bފ.M}=nV(=۹s|9g'oOng)95'NMqFF_Nz^Xex]hQYABǚ)ߨ0Y獍QW*ǨhC>(Ϲ.$L:yx:EB-"8tդ+`$SK3]g' 9>BX:q 9IjڶזWv2-S#U%)QBߞ}ȹa:s0|.IP&s*9 ɽl `(d9?][o+7+<,m^bfXdd:#>9[l],9j[[v.6UXuLLZ\ K>v{]<<-vlF_GCuJT08 Y(X;km"C #JN]8F$ ]hw=Wa^g=/82^ c.Ma|  &!f#c}Qڕt; "gj阪AeU[[`}X~@]~c;(kXZ=LS-Q82DsvYe<)3.e /)|ަv6 BEmޥ$A=wA'&N%H$R>:%߳&.@ 6[r]g ҷn龜Pqg+6'm%ۄ@aG7hrk፬A[S;FV*5z֯ЛVގCބ$KI@i5Ds<(i}4O(Ad=%Bmd`" )JA' ǘ#!8MPk#9Hs@Hҡ=rw}zH< ꑣ1xSْXl.)]^kob,}qPԧR 5cQEvLrkB5rI giU0LE$pۨ2 O(boQKmbvRh2ȉ[( 61,0mJI2dr9Y#W1kuz}[.3;W.^1Т y/BF ;IIYve i7@"p^ӯQCRܰYLI)oy`L$l5>)&m:Zq1Oz]JelP4%Mar̗ )[D] d!J*d1Ofu};z~(79'ZZj$l&3qs IP(ȟſu!nf.Wmi}XX :oT"msؒnmioMQIhbI )A< 139V:\>bf^ 
k4n~*J$J:!aH6ydڦkjer6J],rekW [[bA0TR{Jj%fȕ91b(MHmX`IdNž29t=;~پX&{7W'rP{z-ows~*L;5:z%tht(3K0 !:] ke : jߠ# HH_ 闆D Q١Y{%Rȳ))e )Tt%j8AVHz=CP >.sGJV[,`9X{1ޛo&n]E= !f1 _F(OVdo9.Rui4[/JiKaPRL>K4NЙ [dZ3sOnVUU?[]wdojP m̿uے >FgC=79TnJ9D*r) sH@FEo >ޯI8;N+AU%  J.SBwLo'V:ȺDGN. M5H 1F(rp`>j9}xWZ~j}<_A!|gQ "c1eW>110e*#njRB.8$a`݆O29Ovԧ0:.( _l_T>qʚKђ 9:bN[Lܧ~HE 'E3k>H}gR߿F&w9/B=t z'=Qx@X/uċ3ƫxjtB9zL:D2rVZi1B[1tנƟƏ>TDUD)'RΪg?\əh̝?<~.F˛p6^tˬ__]fʼG<}NymDd^(γ/|- 1J$Ұ3լ$\u'8^ff?U>bfE?OJ%g5g4͌R攜^1)uem#zt}1Z4k#ؓKJ22& LyJ}ݢo>\'u]7b[n^C2mw֑&XQ.[MӬl=|i5yz: G{#oVz49b[&Lo&Za2rn5QI[~6t6Rmjymi5?̃Uml^^@|F}|9WIik{n϶L?ּ_tzWrQ8p~qz9-.|>-ecwzpvnn(Znga#s9J p=ݮ~{5$ɨ[kyAQ7 ,o a»M켙c);݆VBDь!Zslf֖3.6YyYBupO{PXL%.Aj \IȬKNr6&&q+‹`DU/x㘂.8=3\v {ZI.oGoG35ls;f17SOj'm5Iz3꧴| T/tZ+]bOi7U`㛋Ѥ*x/0gvHg/58"xUꏠJ P/fuu"YfE2N l"$1ZͯF!bʵ05vۻPvE*AswiGÍ뫵*Z2+f6/շo?X-}/LGl[gvJoyClzKlV u)Qewߨ*! .BCL^aܫs/MpoD|s`cYkVo=j(m6rFr@~8MK[o>Oswr}bRNWҊͯ ϯt* m s'kTEjyڒzJ=z-Z vg8jz\Bv\B2D ّ⣏09lZ &Y礤_Q*+1ӓ`(Ћ,Y୷AJ4&B4gt ~]s`6K֝^`ݽiY-۲3! KTʩ:W?L(Ȃh%;'ҌM@Q@/BGdL1=|:o5=!kwYjMt7J ӴYկ#'!Ρ2F'E*TGqi )OMCf B--vD>_0pWQhc|};gȼ.z+8 954^?VJ V /ţM1&& p?@v# =$'d ?YpA$L^[%f%o؜HD\6 Rp2gmEq\?ȷybC vƐfpik iG GO_8_\wP>ʹO^wa7=1|ixk9.!qzvK5vL݃~?T6W~/Z%ga n1y3܎ y]չ/wqBz=qnhw7Rwv,op"8#@a2,zvb8pJtJ^k*fn!g]b9P}~LWdOS8پc(Q:ߦḹNG𫱓Or'\ڀ̂ 9$D4AxW+Qy{;7RճctXd:T# _9Ixvc MHx䊛DHR ԟ99@B:qH j;` m{U=tC;o;C;4W @+:Zd׬Kؼy)R^Js  J1)C䄤L !K2'!eа@&Eh悷[~J(G5$,LI(éXDOşC⬃Rrɚ,vˤQ8wਥ;(.$p6E'jH]h>3 E - +V{%hk1 sY'j1qvO&q{5EZ١E6Xbzrh"~$ M=۱d:]}}ۓN؇6?&wN[#ݑR(*SMAP<&:]xnR(]RHmyĊ EFd* Ā"DtƠSV 6)YqbXXL3BQ E$}z~qGbkv3Ѭ{*;MG+Gln}f@h̨6\"1Br N_"%,e%Ð=hdNͰɤ;Ƀ&&QA #v1qv#Êy,]L;EmYe=ݦr,h M8wU dIGVֳDxT;-8ZўEX"FD/&.$G:*k a1qvam/hg`<D,".Gz/ה0Mn˱.&擈23Aa!fJ3)EDrN(8)CnyQbQ{7bVѢ0"g7"^G|qq( tL{Ŵ@\0.{\ܺ/ܑhx*!:`S$0*+{\| \*ygLڲG3R@g'odq.CNQ& [DphI˭4$KfJPe>WNƉ 0Stbdf4v">]'05{]#1=_B|U*˟|n3} v-@.Իo)"qj-d Trb*AeR:9xԊ,B/'&}\DlqiR RXjAn@f,y:{& `UN B210)fM"f{aΉbe="< "d]nո0~K!lJcQ'!)h"x&9'儙!@Z~#F#ft044C&wO\ Zxfjv4[xC6 QH$r+s3#ppE}XNOBN)$0xz)<< ==|m wJʔ1)*V:&]Rnx,PwA"x'x#OtQww\@{_9W@qdQiC9bu`chxv?x0hfIJg}Y~Lɤ':蘳!]B})*m 
n`><OG3Uϝ.>][M_yRպ} ˆZƴ ekoՎد.jW+Vry2e/Zm{HwgjҀ٠)t~w_wY'qZӯijg?O&ߣN鈶C緡iVw#|dƒ`4㩓%N6WS8 Qw4 LnWf|vd^B*tTV/ l /fWJ "Sؤ.GB 4^VHauٔW@{ I1a $ TR$2$9'TIEX*n"BWkږ-Y->5!|Fa|gL6Qos௓b~!ȶl74=PhڀV3DT\$QiueڑHҖZFʈ~C{He2RB(hznI$3d|4XqbglY9bMw$Gĥ BI?d\Zp1 i]Cy-}?Q˥jFB'I&JRX@'@τ+msׂRk$v@ykypX^dɳcLko3`x;yO_7a/,;%,\K eQ1KcI[nvt( :KNR<[6wd޷/4 NjSܢK>Iz:ht^y'&N%H$rb:%&n!χw [=Nxg L;7nõA;>C!<=S`ȰwߢlS,*vz!&hd ښquT9kzϡ7`zGJ푴|7!R+cN`4h `rN%-L@#yƸ*YdD躍,0M 7(u V(ȟ ߅u!.F.uAwE:c Xy+۠X6J-)s5Ip4Lz$Sp{N<`]8:=A/Űʯ}> ! m/?%eO6ydeH_tl2('7je=r6JLr)ҠJvhm jX{ ,žUTW?wAFȕ9Bpnɩڰ :H&;ZvS{Di-灁1#'xEl1@VGGX$[1댜-QH9ۥON-ufuȿf3~seS/ٯ] `rgخ=qܣgBGQ|Rg\%j4HBNZY2_{f:0n׆DA$fJ"Ϧr-XZ2Q:@(ME ~kB [xB;Y PP#gBA<!#Lb@9ΆMcЪ XXk| Xmǚ'6|_ɪQײ/G=i\]'EZ8UWR+cGO7k<,xOJcuyT ^FARXT%>&2'A̅A$2qy13>R?>kta>QѵgGVno 7 Sd4rrR eNxI+BoH>E2]@՚䅳svPt fopWڽ L.cNpzSMx-d,=]ɥ1)֤34 ˞܂Jz* ;NEwQn9ikڬ ys f0U9GdX٘F."pl8 91%[rZN,h%UG;#zT%ƍ@!$%NlcPc"Wb2z! r/LHԖnnzJU%ӧ 2TBUꔬCx\5fx>Wոޒ,z_8ƃ+^^7mM4;sJ{S U\Ktm?ܒETU=kv3<&7]-XkMo^ҷ4oU}Vnf}z~+"dկiv_N)+FU+lbEox5E'a'2 tkgi8jZ5؄T^ bC@dD/U+/SE]3ᛇS` e)}y.r/6EV)}>8z-);1{w2)*z83?'ϜճkB I'J8RHrow ,ACs"\/=&]c&,fO%o 7GG[Iɪ\C z˧ b电ŢcYbXb$p sR5EX.1:5x19ff]t=LulS_-4 <W$9#>e rZLN0B0f]~e?7T]|Bhp|" 9ƀ[J'.geiJoMVW*ʗ/q(QUp1pnay+L;k3|X)bV$WvGվnyJ"SJޫkUF٠U$p2 3eldl_- ~@[ejn~ynAICK7wG]VP:Փ#.ŃuI YQPk'}'˶{pՔRχ}2/ɬ—dup#/!E~>ts־ fվb,JѫAlOU@_T]иbHԖUO"z>{ί{g 5Bm>{XB"^ M(k Zguϙ%3z+g)CgTy2HEc<9jw;o烋~S[rQtTN`:q5jK_@xS}j-c\t-IQΦ U֖iTxEd:g h1tg}!ZqZ_KcM9PB{ }קOru}WlٖQ~ rNԿ ]Ō\-϶&ZpOt@v8_}ϧ#w$_t_ġ)*F߷>,ơyd";Np`wˤ)ie[R1Q"ɜ%ItDq>Qgt #}j\L!( +M0,AȄؕSKv]zWsWoTC>~haOFƎ~Sdǣ07Pi6ռP =aV޽8?aL Z#5djMy@c{fQ>г-:q:f2;}`QS1X6!󭚡}PLb:S7Fe$+u@H7ZH2џS(B)B3!g𲩗^6,~=-G-N#(w16f~VPXgP4u_KIgl3Gί6zU\-&\͚6I:+z,Bem9]ymHxHۦXܚSY?ZOB. 
Q4ti9=.߼gjȚF3n~M7hAl10779qs<)nL<;w>=Y!11ke[K?1o$-7^nvv>EVc瓙!v1{Pä w|}6F!lQ 0=عW#ejmF?t9[ijv_Fv_SWw/5SPUsLY58#*lmeNW],Ștyۈ(+n4]00uu8*p4Dy\F` s.G tvm׼]<Ŀ6*}톐?Y`yE5>:W]|K8jڷWKOZej1zُ,MfgԊ}M^p+׻qe\}!}xw cǥټ D%_y'oUT1c>$@:O<&YXnN;OKޝ'`4?^}>9` ~_۫rO?f=˹ìn S8n1Y,G_ͽ1g%N}ZA6>hJ Xf,Hệ:ťw>/}'rw<|)WAv8vz5Ѽ] mwV>?6c.L(COl&)~a%h''x!9Mke w}hU8_\\mniK$҆tWH !|xd'ι쌖"uڤl(ysޢ*qJʈR(j͑XA)Ig6;VA o`®^Gk2&uEox(RWN8: @k!2= CzPK-SLLj()F"Q M ä"\-@@,AcOp9i]L2%68ЇJLa6x!5TiO!rCȘ"fB4&Yx(!ՑC#(.Xh吱N+2v[Z4`,|B:`i*>vFaPI;8 QT,YW& "QTcpmVeқE vulV!Vj0*Q33昜* / +Q0f(JA5̹rtP \rC@P*ؒ*` 5}S[O[TuLr抵Z!MM(xl4vcei,> vy9bxDj3x+)f 1h a!)) yx&PWMU(`GyR5FaAE"I ݄aK$-0&lJ80)8ЙVNecX"6VVr VQ'J[e@SѝQ3Dt0GQF4`դ=ERP6ZrABj2ZS^vOJ"FMh/kDX\=նsVQc%Gii"j8(S Pe0>j2a+֢ƛ ~\uXgLӄa ~6YvՋ~2mƬ*$c HQIbYj9rGy}3L gت3jG"6\r*䄨G[-kx2AT瑵I|tx #/ f@ӖtdnIKDBKJ7AyA3 vXmAԤƂBG1 D>@ $5-+T^eV***6VYѴy Edub,C>i6yЎ2vp d`: (N6G!cUmI1(#kFV8-5* ΃p:PDڤ pd.ƢΤI`&Ybm4 '+lE]("\x>W+f5(N: ѢaT( ȈOZU:@)E@2=BmX)+g).XZ#eBFZ{&z@pkO#r,%&}*` WMIB1X>ܳFri\U8/ڔOh/t Be_j;J* Yk0m@ I,gfFZV2Zaԋ@+hX4S"qFd>8O=8FiV;ͤ>rI"WŪR盉0K1:Fff%dۈ&"0"pŒSt\NPeۥt^\רvEC@Tv"#DoaL>`j?.z?׋Zxqqq4A,lx*=L ƶ:j;_7~ di\{6p۠zbޏ<-N[,A%^uj8fuujː?m9Nk{`n%8fyfKY;t{>cW]yxE5W0#Bc`0^p+%(EaW$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\ꨂ+k/Ip#\\^/ҚWߣ; HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$"1WI$R#"jJWJ#Ip w$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\WG\rWN\U`^p%< GT$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W *N~=o+z[;C0,t @S)7RsZDF7g>:Q/>uuy||(-9+mn#GQ_{K}(Bz%lv>f#vԡӢ$wb&PUU<rIicZ4gJ3V%'%G@xY$~^+,iu1m]& sng(Sڂ%4n>U63-faFcD"XrV< r70I WY7g2M 0;3KiJ_m&/M6e4"M\b&t6U;~&0xʠSl a۾ _͸)fVw<=ށ;ȆE;%M0p$`fb&!^Ɂ9Zw4m-Ww K=;[˪G o Zrрa,:B6L[eHX\NYߩ\nzb.EX͢ d9aѓ+ۜ@&2 ^ls}Z<@x/+?e7;ޗ9Ǵ,2QK0cyBf\IFU$BIsVɹjm" }V.F˵}j'Sg<.C\ e/kMgMtOw}S(؜=?OcM9,yNK |5]~bĆe>C.¡0kx,>**ٿ/#D[.<eo`/m K0ôh'.{}u;ik5N"rh'^K9+Ӱ=/gjk@&/ͿL"d1Z~6c|`+YIdnpm/4˴{AWȫf1Z$\{#S-w54f6|fuhݻbT̮fd 7{a ' :r+'S DsR;\*x,c:Q' ]fzQtFH^#i}w/ Eb?_Fcv,ÕdXdJfL*eBHJRT&#eYܤ#!i5\~شZ>ߦ5+fR ICcq Ҫ0[N%/tcQB!.ΠSζٙԜC; õ_ SgmIu 
Q"Ŷ<ʾͧZ/RsД,yoqw/_٩JC';8V>Vn jvA~ͼn^~tx7ezo]Rd N=o#:tT7kg@*Nko5iň=IDmp*gG//|77QEQ2[˜]ÑtFZ07c[iE(mb`L#;6ݎrMJoW?;&\j7(BhvJ +!kQ^$J3pZܪH)KXeSFXtvۭ ]3 QLmft C,8A@0)W6;`{UH2E4V=cU3E+`噐\HpZU[hg* D0"ޤ6 4y 9/Qė^ܺ"!!cB d OnXLW o8*;}L2f} ֘zq"{~eY`>2t7$:vSJzI_@f!]IE)ZBaQ5̫2N4+{}.P%'G4zq(7m2DN/ wrmkQ1}!.z1DpMÍXEͱlLHSg99#pX*ÊnW'8eP4jYgE~Iw֫s{ҝ5Pޓ gIb7[ 61{G\AV,,0q#f%"T!N(S%K+7iHNt6a(^uKԎQG[;f/JQKDBh4Ir&QDs)CI9sNFN{N)&/ḓ^n`p6A=$D^$'q6xgPӟT3or!6pRFK OuKT73̛Ј7vRa{qYBB>Gbː ex)1{tIN7tE1.(t8 &pKC@kZ"m9L*ォ<I'bfnU'|ve=oɤWN8}_\tKj]dB _uyyZiۻf= ɒlB4FM`/qV#`gc}='q>6q]3"7}(Fd +kT;S**lIl< &$jcٿ/#xrp^XtĞ>ç~|XRTJUƣT;J'\fHjC5,:<\h)!@ժ"'P}8z|#t)2e3rjyRxXֿDKK4'6k -'=`r0lebcJ`v$.uC0? S{HmMdUmջ"3/wä(8t[w)lb`pqR1N&y. &xl;%aLSJ"u"{v&`LsĂ0y)1q>ta_1 21sqOLKi+tzEDJ P4lbyO捨j_{/,Z+*q(ܿ aQqe 3˄fB~8;j\ 4 R @QʀJ0T)i 1'LqFM,jE*p< n~8BYuB٠1ƥ$IԋdAF+:DP HXF@*J4#34Q qƻdEZ̊wq$~ ۽X<ц|}",3=7=DZ֩f$ Q̲\ӬP-8 Q*Nl qeY@e:Dn*Hպu>q"U_8SDv1T WMIT`]#=_ +M UB54M1PE!3 Tam'2r،%ˢ)$1:g/NйKvҋVBG1_ 0S'JLm;JQxxn&ۭy`TI{Z~X{)5; !baY-gC^zfr, 2Us!A";uK Pjl/Q!Hkoc#|FfgFW=cRԺ6qgщ$e<6[UzL~~OFWV|2 [`ySӼOBFOi7'B/zL# < ?ڞ[;ƸBK;&&VrdECPt[-t1&uZ׾l>>4,c0R9V>#H_TuzYbs0>'oiQ __5KTТ,ϻ}a!EsVpbWuYzm4asCO{)z\8Jǫ)91eO'd P`2!0:[($(ӗv(fNlL[Ms&A[m[b+Ch錿1'V; 3Ej]C(VxHCW0 9Zfp<$ڠu`W\<8J,动=ҜW`uA3(O'0eXp^i3LG tI逡YE9{?m8 IØL~5 "A8uH6#ذNg}ok.H#gfβr[T^W+bf2J0cA/B6@d퓔X2ӠzE*fOd`E3+PTs_ @fh ;bzUS:Pf0T1HeTI*'޻w/^Wjj}iYB,¢X"#u14i]QM$ ,@`I&suXcO샳 ܕ?4̀z+uKMك&dϙoߡY=U&Wn4t?Uvc *6V叹yo}}Oa|kӞ9.,%P eZ+R(BUN;a Lmƶ4bpzȭyo87jzYm^h!A 4(D&%T鏖 ll&K0*ys(huHk)ҝ$Imt(޳"!6X ĈNxQϷՃ/ oO'kK!  
̓a%@IFR)|rTB+$xl|,85QP<y4?|l8t-J14֊ŪV ]' ]Sފa؏9M+A}"cܘ>E|Vݿ6_2Ak4p^M*"]"-.beo  r]*}0NNH <w/zIQ#"GaccVi`ːB}.S VQ_UL S ?u!6 Ll6VwAZai jP7T:xmÜ+DFz 34Te$QlYٙVFK{HHi #-1ԍX$W=@cLc$ǏFY>-,I)=3ϏN`EP:a笾D:yjl.k6`P*"dOpg' cf6 Qۿ *2 2 \Opg!BXZ 7:`||X(r" oOs}k]{nP`3 tw":Bw J\{" [ߢם6I>{#hDW'@'Uy ^@Y|o!~qh-Fr>Usa'gD<@ͫY̖wP9CMyH%=5V j&IeL}Gzs?uT1nC]x^i>Ļ[;Ƙ͑AIF/ˀWqi^<ǭOyכW Pig}$ƚ(dĂOz㭴R "btǾG c<5"+xom%o G_lDʭa1o ~ڭ@,x3~4‘nwZ,l]2˫#\d\~V.Ynf^s<_v_ xg tVB`c_h»srxaƒz.IZT_u,23w5јV.u7OMPeh1ttu~PV?7lS)djN+qn"ȄDaӇT;ue[BͼLqH,f'ӬrdVb#LdM}P2@iyx֪\8ˌ@/JGOH]vl&3+l$ϡ[ok=>;-I5< hqSտT jz|\XBA3QBE40Ub6flt7AfXJ`u[;)|%熴q3NSCAR"t쀫 OY1n&hNNGav,"9jlԦ6 >. &wv8,`HiU`+L(ʼ'z S9`QPȫmvOJcWD&9,Vv]3a1 Z֑]O25ڵ)\  vXXEo2[tmv~'?𥘇U7Q_ C뒫`pLIJ32axk+rXHE1KqvH?lA7*ݯg}N+x7XP͒]*peb1 UV"Uq*:{j CV/v\zn'8> b )JB\TOIɰE!d 2@vj-:(UYpFR~dxM!$-cNSnr*#iykVbcTGYH(yGt c\S ;%)g0zע97ݣ{aSX?++S5fhHVdDu ȡ'j-߿J2LؤGXΞĨ"c$ZEB#1m6OSQu9_6S㌞5ӰT>5)vQO,Q:8qweIzٝ!w.ưBb$tXIRMU̪F.@Z4"#pTfךs+hNk mo8bs"[nvka;*>DKCM0;!n{FK!R=<[Ù쾫w +X O@#=JΣWu}Xc4ԉ&HjV){fiXkf[5iPX!-D .] 5^֬Y"pYv5iQ w2tD$/cpIXMR1A 2m޻ ]}Z?^C@1õ;"P%1&3vLGߧ9L.m3^qO>}ر-nFqv}5)<9}]_~/>Y3iN0dinE8][(YpѸ{"q=Lr=r ~.9ǃue,K3pe G!o ́E`NYbh橔90Lo"{26Q m@6Oyµw SAXkIqk)21*7cnM(Nݲ3+3`M;t`dlO',#d_"ek-g9Z :6:kfjJVˋ Q*^e_a[u 圐p]<27{0%i 4M #jbt+[k`1X{[[ҕ)i{&U#lw{c$KiUJs2f&]M*5w^Hs]r-̭RH<:ykeR<{#+l"=ow]@R(iơRPus~TM{s&M)?0\~f1"v,s)[JӚ3iI[jYRjp?NDmT%#zgIEѨu>g@:{921MII̜YVgB癌h ώyY%Jv}Քb9fxkueRk?PިY|;* ʐckRv[GXÍW"iJ]Ґ];"VO_BW]tF8ţ-1{v6!w=9rT9'1gx"JNrkA6Mlк'>f̺)k"an[;TbDWBT$dg..&5ur&H.H)mQ QbnTMnʬkM`~TC )hfN7ϸ@M3k5}dQS`=rB%u)W֥ h/, hu ڡF\k}tg=F-wY?I DZ.Q٨(ߪQG\x1<=ꙫ3Vw鞡bOʢ/j *zlU 0A JO((#Ԓ2l-lWM֨o͛Ywn TPn(ý⏦L5%;X`x*U19EL72 `Ug7\+9pENZwc{5&z?V-UdSj/=tmжC%IOj^mT;WDg]77ʲAv ʳB ΊbO^8FЎ1Vf%Novg!ID~_\2J{(FQ'(g\HsZ#nd6 竀\gJ{[$@uz6W&pL-XdA$"N ؿZkQ[!jF%ڬea탩z[0Czշu0vA5\Ek/M Cw餫8V! 
j<>?.X7InE4,5ĞôUG~:cSZXbXƎ1>дGzH~qLe'_^4'xLfӫ,h;hifƯElHjc|8&b >TҬ1Ɵ׋lzKIhێ9xUj!?v?u>cƴ7ħhMUYqL,DCT>l;A߇d*mfE$)a,eWe_0sdb1g%U_ϐ-@Q= K kfNCꬎӪF=s cZ~gam:y-԰{_-g%aډ :sΕ8wN⅏RKoBkZD AyJQJw{ |=Œڨ#6(Q~ʨ}i6h>a`mv|eq&:rݰ!fBa`Jp{>J b$}XC/|4OU%DrYTv8~_ΖbW_'ىD Dni-Œn*NOxyP#G pFᘱ^Q lDRJƔ ?U鏹Frbz @FCD Q(,v҆昖Egmd{"ڶ/'ş #] JMYL&qt*#P)a^&V 1Wã)p t֪F?vyV[]ysW~>+(\?OLsN/B1)/~r6b2*h3giSw/v4$9]Gds6Wu NdZK:O~~v#| PN+}>vd D/x2`#j5keAiiLN& wMг{2phY:7 |QE0TPQ V"tFՑFQZ0Ω)|LD rkD0% ́$t1X#]%-S[&* -5\& ^#֖Ƣa 8{̳9ۉ- -[*NDCpԅGL@&]P.h1,7w$OAp4kKU,-m!`}0E9p0*(HZrJa\T{ٟ`soRәQ~ F+ӕu,ڳ*ʮ-D֠dKݦ]0 k-01ݢ&!` Mq%PI'๋"MNt.yRv_Hhpxg&`""pܨ?0@e (dpo:HK!C` lQDPw V;@EQzk6py**LHxWg{Vh^Tx.&%iP}$'p]ʖ7kU|jP ]0("a £8( vz.O6):᫫D0ى|3fA:; !Bt*.룧=8D"5q = R8nb 1Fo{z1F{]NꌷANˡkk}6Hr\ **dpYn1KYGYj~ ꅡ}j3mj7tI;pDš8k ꦬY/-9U #cp p:oro5fwiC[_f ' NqJZ*{jЫЪ}ZPҼr `6'Oa%V\w" ,'py,Tحq01J(19y\cZTRm}p + as ``y {cn~j D ,D!SLjrB@Am)sc+_QFITWImI;=lV'r.Pe0ȿK\#]Hk2=x1. Ftz6.^Ffg˯gWxZ~i*_dzMH<>OwqUSF@;I!|piy}ov7hc1kD13X8⡁'\#f0Lh-ؙo,x9"[?^[]wtψ9?{Vdf˘#@cn 3mv-GI oQ+Ky}uӍEVQX< Jb\1zQNȲf}OQ\ˤՏr-n潖8rD..f:n܅A \_/xz?Z t[2iY%vR}5z=d73Q;OMA~ՃU\msZo?!. 
i*^Vecw>ҰYE`ӧiw3\O;\!ԍgxVNAke|Ԣlw#xӹ n)< b/^ 5u3e`Z6%pu/:!qѲ ?{[/=n\6S/E+VN[gPMNj2'#N"wl=¨я~LJ9S@;`2|,>ąfh1v8sL!59N ;|~--,IL%;I{or)cV}LG9>wjL'cq+Sz!~`MrC?C:*׊=>kQMܮZӞu+ 3˻>DO"~AB#d&y(J b(SԞO: *%JkA,9+S>lI}X+8nC¨СU}=3EO>z*4v6 (B#?_]!%$eppC~ q!hq|aB eVEwQGg)+"1hЎkC6A4%giڨ9$%"xZXBu4$|Pi\)C;<)Hbat1JPsQmVX8NOfB*^o@ǀKt2 t <+UJ1 - ^A$nҌTtұ_L/U4(66JͅЪ@龁%c,3 Nf!"U=*K \dZH#ť IheL?.*[{`yA?U]撤쀊ly۱̚$UܓR Og&#`"@)J*-BlbZ/`2NЗ0fy,_Ø4}Mg$X?0|2U9866%:'יIJxಓ2F/BB}*$M\+eƅ8z #8dܱk/ x.UH^$% M˸>;iXm.t:5: THƘ.h՞'ƌMd*&Gr7Z`GGKᙢM o䌩,ނ1[xx EEyj)լ:M۶٬}>6>ԙ*hWc]r(Qf\"eƭZ;{l\FC-4iAoAp'Ԕ@㌘R Dউ2-NTJوĽZncfF"aq).GZ 쀾A[͓FuO#Ȏ ~*ܦB*$;0/ }+H ڤx4R-=͞A QA)XcH`OZ[&eEx+hr12@TLlG*qĺ" 5Rp:jA[9}E $+EzvԁhۃzmA @` Phai8En>˙ isɘa8$}C6 =4Y/[zZ/K!W{)q[]eEӢw{uu.!徉ܷ EB8`xBcWĊ@>'*cȥ|$ mKI><߄_`}siGsM.(N(DXbi)oCLyqk´-N_hZ9?L' i '`WzΗ3!(!%\av[ɥs 3 |MFcNK`Qzȹk*h`(ζ$J%*hWFEȸE#GK) H\d(M'Jb-Gwv9_\Y2.W)12=;q#VP;_~y.R]j_y&ieQr-ܻ?L'.z!2]xp!(Gcxp.B5מSca}bJތx4|Y@w)ۤJRZ*wMozhqW=28 p+WL cB$Q\,Ed t3d*eYʄTAtUɌ;4*{Z lɞ7[#y}f מUePHW+Q %FJKPf}r%%[R [)HDft<_̾U^^~QMU o6DfܵL*|/ؒ;ad(o#c݇fflП:P,nf[b5}'u3 Wfm4\($dPĠOA1`QH8"& Ч6pM-mD(g")}0m:˝-3-@<.%@8\?=[yJJE/o S}f!6=UixxBsQT2K dgq|zx s ԴV^0 9o)P?26 ;*̸NaznL$De8q>bL.uUTPжP;xUn"~[87C9{A s1=?pt㌅!=g`r7< VQ21cQD/C )RX 4wk=f( Y7EE^WŐ/ I1$/ ./8vRrڼX($%Z%d4[wOf!m}ЪΛ߷x1ڵ 2PȠp!+I. ޱ6 +/RfO_43=koPּ{8 mpD M}Z}; Uyi<};<<+Ռp;`GւKFˬDƕMp8^Β8xQ0b@Et[ 1i7jG޷VC۴*޴Ug烶̴UYڻv#+gfԋ{PƗP*R+'J)_%I18Ћiq'QLG[*Ux4O&ˆdSMWN󈐣\Wծ"Hx:}Ԥ*x)U<~ {m۰*?i4Z|%q?x|XT?Fp詡/ /K3YMpLLajSB0_}@UN랜"%lт*ůքOѓPqbkweK'ru؝b.t<~Z-hMZ蠜 }->y=7w  6blbŮWE|>e2HY'* 58oAGh`mЗ^b(&E[4)aaX\{lġmAD=9̙~⇃gq>dOi<gW vd3(e0(3 ,dTGsHGdPTtnЈ:Ոb|C'z:zCaFzNt)? ɰخ \Wj w+ɫ#;`ɠwR :;kWC{mkkhn=a : B"T0Q}hYVCRwT'e-*8 K?D"lN! 
zoroUX,.DYv?x1rce;bs ̷YÈutU =_`ZŚcA^yqV%V [ K}I.}oJspM;gs^ $PWfݛ]"5s7‰"/b%+lZb4w;`Uk %]e6D^Vu>b4Yv<͖N] z u35>HB.HWNgL5( %uЮ98b$]\3GӊT+G8埴+vf`UQĞnߋK?\o1^ĊyOHn,8WJ^v(J Ÿx|~xWfr25sW;4Hy -Qu]ʮzmeW@ɗR:+us[eiEjV3@ͯɧz^j5s VM`].2t%s|}^-h9!Z`Q)x2Y}\^M&u%Y}R;:6n" H}>\OX *_3%K$yW?RF]n {eqpy@PeoU\kDuP&MMoS9#(S/RBl2TAcy0b{S?RöMO$ѫ#bP󩊭B~d˹J@ڊ-~WάՄa7dHAn ]=wCU0֐i>JC˘6ܻnz>viՠ涍Px z('ȥ2iCi%L- E(yBxCc@Bj]ijUmh,ۖ36~un&pf^C[rP#,vP*_@C YPɴUN:n ;ml[bNw Lٙpp%#a- b`ewP١Ie`)D+ w[r  \\ {~hmD"a mؤ}Z=8CZwËB(.xߣ?*Ī*q/yeo,vg53i,xD:!,&KU(r+P*Ml J,=<2 PTf"i45n;s8'ab\ ؆ID1U `oӿ@Q ~ojI@ 5],;I?dvv=]Vu|i4}uFF>T_ NRŚV,dGdlIdck)"zjѰ+`(S;OF,a7r`J\BK94BA% ܼ7C*{M HħEذ D}ao[6YGۥC/]PL8wDCk+(ygw2ŲV)qZ2n-fW̞ܚ`oާ8e\lǟ&{q30 օ|jmF^&u6xN[$#E[;Ͻ{ŷ+֖+i0cB{$0kkX.-a [+ʝ^ {ΫP4,vsbxï ʅ(5sXsDJ#Ƽ` ݠOeCf-Ъ^C݀Ԙ"Ip{-c_u @).=:b,k5$Tqm #~T=e$PH,i0@0DuW *tפA=<9zF䠪X-OZCj)gSW8 "h.ϊҙVP3(Ѫ~U'e~EdjMEL.s)27dkQMu[뵨#JڶU "j[`Q&U pyXg@hޑ3tSo a8D'JeQ~͟_M6tΰ\޾錕뉃9tJGZR:,gR,Ϸ $i7:AbF~̴l,˛oq҃hZW ׫zQLP+M<](h4tyq|'j6Kx>INWWAK "BqoYY3/zٶLܾzC,aeq#hN ph<˫H8B&4tqu}4ɢG6bn6,* 7DS!Az(9Hz.=6@,1 8<\>Q{( ^ qui۫u8LQgϒ`JC55+;.52D*|y,ŧj*2?x!-`Uo?_ߝD Bkl,u]?>رdzל+Je9韣fWR@D.D9kS2mF;o>NUy |҈$II,SskXA9"X`XjQV)ڙ j*=9# +ݿM@c3rZwžəIİY @qĦ41&N17c.N ͌6"KAxayNa`0QT8G$n <i)96 [ړ,KD&C$%X`Li"V H g6uLel{! gNy ڋa4w0JD$~IO@ s5~[<r6"F(+oOW9X; >r~-?^EkyvF20\frecQh3b J#qV KlZVւ|r)~{:&M@q&Юx +Rk4\WR `TZs,iY$ T\HDHhP7DpInN>${Tg7eX*!x.EM_M!zn4#JYA$RI9N&Zl5։bcr(t2UZ%E8[eK8OD v J3p!Il/)I8:1ʌ+d $D!,SN4!-S8Ord (L~4W w3EP"cDgcu"1RD b/ۣyk:<3"4MAͤ?OnқCM-d2b'Z`kò܋gX#UΘR+5c Km2 0bIiJed?ZjPN0jqٕ.""];eMԱ #DR rQe%f! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 24 10:09:01 crc kubenswrapper[4985]: body: Feb 24 10:09:01 crc kubenswrapper[4985]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:49.252545894 +0000 UTC m=+13.726738494,LastTimestamp:2026-02-24 10:08:49.252545894 +0000 UTC m=+13.726738494,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:09:01 crc kubenswrapper[4985]: > Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.256791 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726e1b5c4dd3d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:49.252654397 +0000 UTC m=+13.726846997,LastTimestamp:2026-02-24
10:08:49.252654397 +0000 UTC m=+13.726846997,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.262157 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 24 10:09:01 crc kubenswrapper[4985]: &Event{ObjectMeta:{kube-apiserver-crc.189726e20dba5c06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 24 10:09:01 crc kubenswrapper[4985]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 24 10:09:01 crc kubenswrapper[4985]: Feb 24 10:09:01 crc kubenswrapper[4985]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:50.728360966 +0000 UTC m=+15.202553536,LastTimestamp:2026-02-24 10:08:50.728360966 +0000 UTC m=+15.202553536,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:09:01 crc kubenswrapper[4985]: > Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.266510 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189726e20dbb4b49 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:50.728422217 +0000 UTC m=+15.202614777,LastTimestamp:2026-02-24 10:08:50.728422217 +0000 UTC m=+15.202614777,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.270181 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189726e20dba5c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 24 10:09:01 crc kubenswrapper[4985]: &Event{ObjectMeta:{kube-apiserver-crc.189726e20dba5c06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 24 10:09:01 crc kubenswrapper[4985]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 24 10:09:01 crc kubenswrapper[4985]: Feb 24 10:09:01 crc kubenswrapper[4985]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:50.728360966 +0000 UTC m=+15.202553536,LastTimestamp:2026-02-24 10:08:50.734070486 +0000 UTC 
m=+15.208263046,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:09:01 crc kubenswrapper[4985]: > Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.275789 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189726e20dbb4b49\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189726e20dbb4b49 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:50.728422217 +0000 UTC m=+15.202614777,LastTimestamp:2026-02-24 10:08:50.734117158 +0000 UTC m=+15.208309718,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.280666 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189726df5b6d0bc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189726df5b6d0bc7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:39.147015111 +0000 UTC m=+3.621207671,LastTimestamp:2026-02-24 10:08:51.380157665 +0000 UTC m=+15.854350225,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.284848 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189726df66314ebd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189726df66314ebd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:39.327649469 +0000 UTC m=+3.801842019,LastTimestamp:2026-02-24 10:08:51.563354106 +0000 UTC m=+16.037546666,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.288607 4985 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189726df66d5ed50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189726df66d5ed50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:39.338437968 +0000 UTC m=+3.812630528,LastTimestamp:2026-02-24 10:08:51.570754165 +0000 UTC m=+16.044946725,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.293722 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 24 10:09:01 crc kubenswrapper[4985]: &Event{ObjectMeta:{kube-controller-manager-crc.189726e409d2f331 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 24 10:09:01 crc kubenswrapper[4985]: body: Feb 24 10:09:01 crc kubenswrapper[4985]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:59.252798257 +0000 UTC m=+23.726990827,LastTimestamp:2026-02-24 10:08:59.252798257 +0000 UTC m=+23.726990827,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:09:01 crc kubenswrapper[4985]: > Feb 24 10:09:01 crc kubenswrapper[4985]: E0224 10:09:01.297514 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726e409d3fb1b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:59.252865819 +0000 UTC m=+23.727058379,LastTimestamp:2026-02-24 10:08:59.252865819 +0000 UTC m=+23.727058379,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:02 crc kubenswrapper[4985]: I0224 10:09:02.189613 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:03 crc kubenswrapper[4985]: I0224 10:09:03.193586 4985 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:04 crc kubenswrapper[4985]: I0224 10:09:04.126046 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:04 crc kubenswrapper[4985]: I0224 10:09:04.129113 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:04 crc kubenswrapper[4985]: I0224 10:09:04.129172 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:04 crc kubenswrapper[4985]: I0224 10:09:04.129196 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:04 crc kubenswrapper[4985]: I0224 10:09:04.129238 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:09:04 crc kubenswrapper[4985]: E0224 10:09:04.132046 4985 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 24 10:09:04 crc kubenswrapper[4985]: E0224 10:09:04.132195 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 24 10:09:04 crc kubenswrapper[4985]: I0224 10:09:04.190142 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:05 crc kubenswrapper[4985]: I0224 10:09:05.193081 4985 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:05 crc kubenswrapper[4985]: W0224 10:09:05.595907 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 24 10:09:05 crc kubenswrapper[4985]: E0224 10:09:05.595989 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 24 10:09:05 crc kubenswrapper[4985]: W0224 10:09:05.725622 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 24 10:09:05 crc kubenswrapper[4985]: E0224 10:09:05.725679 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 24 10:09:06 crc kubenswrapper[4985]: I0224 10:09:06.189262 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:06 crc kubenswrapper[4985]: E0224 10:09:06.347113 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:09:07 
crc kubenswrapper[4985]: I0224 10:09:07.191562 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:08 crc kubenswrapper[4985]: I0224 10:09:08.191474 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:08 crc kubenswrapper[4985]: I0224 10:09:08.263989 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:08 crc kubenswrapper[4985]: I0224 10:09:08.265247 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:08 crc kubenswrapper[4985]: I0224 10:09:08.265309 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:08 crc kubenswrapper[4985]: I0224 10:09:08.265330 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:08 crc kubenswrapper[4985]: I0224 10:09:08.266260 4985 scope.go:117] "RemoveContainer" containerID="2238da5b5e4c426c15215f83d466f4ca89a264c1eeb2ca37b5867f66821bb232" Feb 24 10:09:08 crc kubenswrapper[4985]: W0224 10:09:08.810975 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 24 10:09:08 crc kubenswrapper[4985]: E0224 10:09:08.811030 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.030831 4985 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:43910->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.030922 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:43910->192.168.126.11:10357: read: connection reset by peer" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.030975 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.031125 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.032315 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.032344 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.032356 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.032878 4985 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.033071 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b" gracePeriod=30 Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.043438 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 24 10:09:09 crc kubenswrapper[4985]: &Event{ObjectMeta:{kube-controller-manager-crc.189726e650a4fc6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:43910->192.168.126.11:10357: read: connection reset by peer Feb 24 10:09:09 crc kubenswrapper[4985]: body: Feb 24 10:09:09 crc kubenswrapper[4985]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:09:09.030902894 +0000 UTC m=+33.505095454,LastTimestamp:2026-02-24 10:09:09.030902894 +0000 UTC m=+33.505095454,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:09:09 crc kubenswrapper[4985]: > Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.045364 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726e650a5a486 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:43910->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:09:09.030945926 +0000 UTC m=+33.505138486,LastTimestamp:2026-02-24 10:09:09.030945926 +0000 UTC m=+33.505138486,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.051309 4985 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726e650c5c16a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:09:09.033050474 +0000 UTC m=+33.507243054,LastTimestamp:2026-02-24 10:09:09.033050474 +0000 UTC m=+33.507243054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.062641 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189726def4a0570b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726def4a0570b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:37.422323467 +0000 UTC m=+1.896516037,LastTimestamp:2026-02-24 10:09:09.057273657 +0000 UTC m=+33.531466217,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.191433 4985 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.288144 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189726df0a4b7531\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726df0a4b7531 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:37.785859377 +0000 UTC m=+2.260051937,LastTimestamp:2026-02-24 10:09:09.282191622 +0000 UTC m=+33.756384222,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.303870 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189726df0adc8f50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726df0adc8f50 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:37.795368784 +0000 UTC m=+2.269561344,LastTimestamp:2026-02-24 10:09:09.296468274 +0000 UTC m=+33.770660834,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.454045 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.454536 4985 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b" exitCode=255 Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.454649 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b"} Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.454749 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd"} Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.454927 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:09 
crc kubenswrapper[4985]: I0224 10:09:09.456324 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.456653 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.456687 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.456697 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.456861 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.458619 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c" exitCode=255 Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.458663 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c"} Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.458700 4985 scope.go:117] "RemoveContainer" containerID="2238da5b5e4c426c15215f83d466f4ca89a264c1eeb2ca37b5867f66821bb232" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.458827 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.459877 4985 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.459943 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.459963 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:09 crc kubenswrapper[4985]: I0224 10:09:09.460846 4985 scope.go:117] "RemoveContainer" containerID="cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c" Feb 24 10:09:09 crc kubenswrapper[4985]: E0224 10:09:09.461637 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:09:10 crc kubenswrapper[4985]: I0224 10:09:10.191480 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:09:10 crc kubenswrapper[4985]: I0224 10:09:10.463594 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.031029 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.031245 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:11 crc 
kubenswrapper[4985]: I0224 10:09:11.032412 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.032472 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.032496 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.133233 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.134909 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.134957 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.134976 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.135015 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:09:11 crc kubenswrapper[4985]: E0224 10:09:11.137792 4985 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 24 10:09:11 crc kubenswrapper[4985]: E0224 10:09:11.138336 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 24 10:09:11 crc kubenswrapper[4985]: I0224 10:09:11.191265 4985 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.192528 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.746958 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.747236 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.748726 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.748817 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.748837 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:12 crc kubenswrapper[4985]: I0224 10:09:12.749982 4985 scope.go:117] "RemoveContainer" containerID="cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c"
Feb 24 10:09:12 crc kubenswrapper[4985]: E0224 10:09:12.750285 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:09:13 crc kubenswrapper[4985]: I0224 10:09:13.190261 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:14 crc kubenswrapper[4985]: I0224 10:09:14.189985 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.191215 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:15 crc kubenswrapper[4985]: W0224 10:09:15.210023 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:15 crc kubenswrapper[4985]: E0224 10:09:15.210247 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.896561 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.896875 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.898262 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.898308 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.898320 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:15 crc kubenswrapper[4985]: I0224 10:09:15.898926 4985 scope.go:117] "RemoveContainer" containerID="cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c"
Feb 24 10:09:15 crc kubenswrapper[4985]: E0224 10:09:15.899105 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:09:16 crc kubenswrapper[4985]: I0224 10:09:16.191594 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:16 crc kubenswrapper[4985]: I0224 10:09:16.252304 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:09:16 crc kubenswrapper[4985]: I0224 10:09:16.252724 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:16 crc kubenswrapper[4985]: I0224 10:09:16.253781 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:16 crc kubenswrapper[4985]: I0224 10:09:16.253905 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:16 crc kubenswrapper[4985]: I0224 10:09:16.253989 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:16 crc kubenswrapper[4985]: E0224 10:09:16.347245 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:09:17 crc kubenswrapper[4985]: I0224 10:09:17.190545 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:18 crc kubenswrapper[4985]: I0224 10:09:18.139087 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:18 crc kubenswrapper[4985]: I0224 10:09:18.140862 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:18 crc kubenswrapper[4985]: I0224 10:09:18.140964 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:18 crc kubenswrapper[4985]: I0224 10:09:18.140983 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:18 crc kubenswrapper[4985]: I0224 10:09:18.141025 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:09:18 crc kubenswrapper[4985]: E0224 10:09:18.146432 4985 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:09:18 crc kubenswrapper[4985]: E0224 10:09:18.146743 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:09:18 crc kubenswrapper[4985]: I0224 10:09:18.187263 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:19 crc kubenswrapper[4985]: I0224 10:09:19.190575 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:19 crc kubenswrapper[4985]: I0224 10:09:19.252467 4985 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:09:19 crc kubenswrapper[4985]: I0224 10:09:19.252528 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:09:19 crc kubenswrapper[4985]: E0224 10:09:19.258600 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189726e409d2f331\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 24 10:09:19 crc kubenswrapper[4985]: &Event{ObjectMeta:{kube-controller-manager-crc.189726e409d2f331 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 24 10:09:19 crc kubenswrapper[4985]: body:
Feb 24 10:09:19 crc kubenswrapper[4985]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:59.252798257 +0000 UTC m=+23.726990827,LastTimestamp:2026-02-24 10:09:19.252506793 +0000 UTC m=+43.726699353,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 24 10:09:19 crc kubenswrapper[4985]: >
Feb 24 10:09:19 crc kubenswrapper[4985]: E0224 10:09:19.262301 4985 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189726e409d3fb1b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189726e409d3fb1b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:08:59.252865819 +0000 UTC m=+23.727058379,LastTimestamp:2026-02-24 10:09:19.252552094 +0000 UTC m=+43.726744654,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:09:20 crc kubenswrapper[4985]: I0224 10:09:20.190927 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:21 crc kubenswrapper[4985]: I0224 10:09:21.191450 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:22 crc kubenswrapper[4985]: I0224 10:09:22.191751 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:22 crc kubenswrapper[4985]: W0224 10:09:22.461375 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 24 10:09:22 crc kubenswrapper[4985]: E0224 10:09:22.461428 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 24 10:09:23 crc kubenswrapper[4985]: I0224 10:09:23.189371 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:24 crc kubenswrapper[4985]: I0224 10:09:24.192054 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:25 crc kubenswrapper[4985]: I0224 10:09:25.147394 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:25 crc kubenswrapper[4985]: I0224 10:09:25.149201 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:25 crc kubenswrapper[4985]: I0224 10:09:25.149241 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:25 crc kubenswrapper[4985]: I0224 10:09:25.149252 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:25 crc kubenswrapper[4985]: I0224 10:09:25.149282 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:09:25 crc kubenswrapper[4985]: E0224 10:09:25.154118 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:09:25 crc kubenswrapper[4985]: E0224 10:09:25.154164 4985 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:09:25 crc kubenswrapper[4985]: I0224 10:09:25.188062 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:25 crc kubenswrapper[4985]: W0224 10:09:25.385551 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 24 10:09:25 crc kubenswrapper[4985]: E0224 10:09:25.385621 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.189353 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.256344 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.256488 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.257564 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.257624 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.257642 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.259883 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:09:26 crc kubenswrapper[4985]: E0224 10:09:26.347980 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.506538 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.507310 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.507369 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:26 crc kubenswrapper[4985]: I0224 10:09:26.507383 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:27 crc kubenswrapper[4985]: I0224 10:09:27.189969 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:28 crc kubenswrapper[4985]: I0224 10:09:28.190435 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:28 crc kubenswrapper[4985]: W0224 10:09:28.731555 4985 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Feb 24 10:09:28 crc kubenswrapper[4985]: E0224 10:09:28.731625 4985 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 24 10:09:29 crc kubenswrapper[4985]: I0224 10:09:29.190404 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:30 crc kubenswrapper[4985]: I0224 10:09:30.190308 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.190381 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.264330 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.265016 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.265115 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.265951 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.265986 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.265997 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.266146 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.266204 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.266217 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.266975 4985 scope.go:117] "RemoveContainer" containerID="cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.521879 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.525309 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af"}
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.525484 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.526487 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.526536 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:31 crc kubenswrapper[4985]: I0224 10:09:31.526550 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.155073 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.156485 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.156543 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.156561 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.156595 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:09:32 crc kubenswrapper[4985]: E0224 10:09:32.159858 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:09:32 crc kubenswrapper[4985]: E0224 10:09:32.160424 4985 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.187603 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.530132 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.530744 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.532741 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af" exitCode=255
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.532784 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af"}
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.532830 4985 scope.go:117] "RemoveContainer" containerID="cf633cddf4ec92069e7180462c33ade3659c28c175bd59b4980042260486339c"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.533035 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.533922 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.533971 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.533988 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.534943 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af"
Feb 24 10:09:32 crc kubenswrapper[4985]: E0224 10:09:32.535225 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:09:32 crc kubenswrapper[4985]: I0224 10:09:32.746851 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.189973 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.537127 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.539053 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.540184 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.540214 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.540225 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:33 crc kubenswrapper[4985]: I0224 10:09:33.540691 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af"
Feb 24 10:09:33 crc kubenswrapper[4985]: E0224 10:09:33.540873 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:09:34 crc kubenswrapper[4985]: I0224 10:09:34.191969 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.190132 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.896499 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.896701 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.897747 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.897782 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.897792 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:35 crc kubenswrapper[4985]: I0224 10:09:35.898409 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af"
Feb 24 10:09:35 crc kubenswrapper[4985]: E0224 10:09:35.898653 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:09:36 crc kubenswrapper[4985]: I0224 10:09:36.191978 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:36 crc kubenswrapper[4985]: E0224 10:09:36.348143 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:09:37 crc kubenswrapper[4985]: I0224 10:09:37.190791 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:38 crc kubenswrapper[4985]: I0224 10:09:38.191756 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:39 crc kubenswrapper[4985]: I0224 10:09:39.161028 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:39 crc kubenswrapper[4985]: I0224 10:09:39.162818 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:39 crc kubenswrapper[4985]: I0224 10:09:39.162869 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:39 crc kubenswrapper[4985]: I0224 10:09:39.162907 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:39 crc kubenswrapper[4985]: I0224 10:09:39.162942 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:09:39 crc kubenswrapper[4985]: E0224 10:09:39.167278 4985 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:09:39 crc kubenswrapper[4985]: E0224 10:09:39.167354 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:09:39 crc kubenswrapper[4985]: I0224 10:09:39.187791 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:40 crc kubenswrapper[4985]: I0224 10:09:40.191937 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:41 crc kubenswrapper[4985]: I0224 10:09:41.194412 4985 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:09:41 crc kubenswrapper[4985]: I0224 10:09:41.553390 4985 csr.go:261] certificate signing request csr-6mhtd is approved, waiting to be issued
Feb 24 10:09:41 crc kubenswrapper[4985]: I0224 10:09:41.561324 4985 csr.go:257] certificate signing request csr-6mhtd is issued
Feb 24 10:09:41 crc kubenswrapper[4985]: I0224 10:09:41.579463 4985 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 24 10:09:41 crc kubenswrapper[4985]: I0224 10:09:41.984929 4985 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 24 10:09:42 crc kubenswrapper[4985]: I0224 10:09:42.294921 4985 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 24 10:09:42 crc kubenswrapper[4985]: I0224 10:09:42.562170 4985 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 00:51:49.233519957 +0000 UTC
Feb 24 10:09:42 crc kubenswrapper[4985]: I0224 10:09:42.562209 4985 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6902h42m6.671313359s for next certificate rotation
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.167803 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.168867 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.168952 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.168968 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.169124 4985 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.175726 4985 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.176038 4985 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.176067 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.178600 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.178624 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.178632 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.178646 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:09:46 crc kubenswrapper[4985]: I0224
10:09:46.178655 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:46Z","lastTransitionTime":"2026-02-24T10:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.192500 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.196290 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.196330 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.196338 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.196355 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.196366 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:46Z","lastTransitionTime":"2026-02-24T10:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.206870 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.210748 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.210787 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.210796 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.210812 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.210824 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:46Z","lastTransitionTime":"2026-02-24T10:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.221463 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.225263 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.225315 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.225329 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.225352 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:09:46 crc kubenswrapper[4985]: I0224 10:09:46.225365 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:46Z","lastTransitionTime":"2026-02-24T10:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.236526 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.236650 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.236676 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.337737 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.349107 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.438607 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.539440 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.639954 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.741055 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.841910 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:46 crc kubenswrapper[4985]: E0224 10:09:46.942729 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: 
E0224 10:09:47.043458 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.144550 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.245497 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.346491 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.447400 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.548328 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.648750 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.749306 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.850406 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:47 crc kubenswrapper[4985]: E0224 10:09:47.951214 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.052385 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.152513 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.253463 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.354027 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.454951 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.556239 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.657374 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.758121 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.859156 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:48 crc kubenswrapper[4985]: E0224 10:09:48.960133 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.061346 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.162306 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.262682 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.363751 4985 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.464822 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.565690 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.666469 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.766848 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.868009 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:49 crc kubenswrapper[4985]: E0224 10:09:49.968095 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.070669 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.171678 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: I0224 10:09:50.264485 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:09:50 crc kubenswrapper[4985]: I0224 10:09:50.265552 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:09:50 crc kubenswrapper[4985]: I0224 10:09:50.265597 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:09:50 crc kubenswrapper[4985]: I0224 10:09:50.265610 4985 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:09:50 crc kubenswrapper[4985]: I0224 10:09:50.266783 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.266988 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.272626 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.372914 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.473272 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.573390 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.674101 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.774702 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 10:09:50.875697 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:09:50 crc kubenswrapper[4985]: E0224 
10:09:50.976475 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.076793 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.177788 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.278510 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.378883 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.479250 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.579731 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.680281 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.781335 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.882321 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:51 crc kubenswrapper[4985]: E0224 10:09:51.983262 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.083386 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.183974 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.284603 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.385574 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.485768 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.585869 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.686659 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.787766 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.888444 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:52 crc kubenswrapper[4985]: E0224 10:09:52.989650 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.090487 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.190936 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.291778 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.392881 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.493910 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.594367 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.694839 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.795869 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.896482 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:53 crc kubenswrapper[4985]: E0224 10:09:53.997651 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.098351 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.198874 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.299389 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.400401 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.501221 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.602195 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.702546 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.803259 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:54 crc kubenswrapper[4985]: E0224 10:09:54.904364 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.005493 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.106689 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.206874 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.307328 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.408413 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.509537 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.610457 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.711208 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.811570 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:55 crc kubenswrapper[4985]: I0224 10:09:55.837357 4985 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 24 10:09:55 crc kubenswrapper[4985]: E0224 10:09:55.912166 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.012851 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.113858 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.214349 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.293933 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.302106 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.302178 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.302194 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.302217 4985 kubelet_node_status.go:724] "Recording event
message for node" node="crc" event="NodeNotReady"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.302234 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:56Z","lastTransitionTime":"2026-02-24T10:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.320802 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.327727 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.327772 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.327783 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.327801 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.327812 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:56Z","lastTransitionTime":"2026-02-24T10:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.341318 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.346495 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.346587 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.346615 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.346652 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.346677 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:56Z","lastTransitionTime":"2026-02-24T10:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.349702 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.358722 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9
426710fd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.362862 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.362937 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.362952 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.362973 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:09:56 crc kubenswrapper[4985]: I0224 10:09:56.362990 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:09:56Z","lastTransitionTime":"2026-02-24T10:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.374996 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.375168 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.375209 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.475615 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.576518 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.677044 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.777609 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.878453 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:56 crc kubenswrapper[4985]: E0224 10:09:56.979420 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.079699 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.180551 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.281067 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.381512 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.481915 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.583105 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.684092 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.786639 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.887453 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:57 crc kubenswrapper[4985]: E0224 10:09:57.988319 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.088868 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.189076 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.289574 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.390282 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.490814 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.591312 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.692516 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.793129 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.894119 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:58 crc kubenswrapper[4985]: E0224 10:09:58.995132 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.095837 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.196180 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.296370 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.397189 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.497972 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.598544 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.699637 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.800791 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:09:59 crc kubenswrapper[4985]: E0224 10:09:59.901144 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.002190 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.103368 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.203940 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.304464 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.405224 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.506254 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.606837 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.707337 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.807695 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:00 crc kubenswrapper[4985]: E0224 10:10:00.908682 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.008954 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.110127 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.211043 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.312141 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.412266 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.512404 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.612671 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.713010 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.814207 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:01 crc kubenswrapper[4985]: E0224 10:10:01.915330 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 
10:10:02.015781 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.117005 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.217832 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.318712 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.419612 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.520754 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.621387 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.722255 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.823009 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:02 crc kubenswrapper[4985]: E0224 10:10:02.924179 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.025186 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.126341 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 
10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.227166 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: I0224 10:10:03.263844 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:10:03 crc kubenswrapper[4985]: I0224 10:10:03.265522 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:03 crc kubenswrapper[4985]: I0224 10:10:03.265585 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:03 crc kubenswrapper[4985]: I0224 10:10:03.265608 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:03 crc kubenswrapper[4985]: I0224 10:10:03.266733 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.267283 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.328178 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.428737 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.529326 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.629680 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.730809 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.831851 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:03 crc kubenswrapper[4985]: E0224 10:10:03.932935 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.034141 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.135324 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.236234 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.337125 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.437660 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.538795 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.639919 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.740070 4985 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.840716 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:04 crc kubenswrapper[4985]: E0224 10:10:04.941979 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.043015 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.144125 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.244333 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.344779 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.445815 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.546378 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.646849 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.747666 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.848679 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:05 crc kubenswrapper[4985]: E0224 10:10:05.948952 4985 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.024576 4985 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.049943 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.150555 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.251177 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.350708 4985 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.351792 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.452746 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.512219 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.517787 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.517842 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.517860 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.517932 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.517957 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:06Z","lastTransitionTime":"2026-02-24T10:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.537590 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.552661 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.552714 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.552730 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.552752 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 
10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.552769 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:06Z","lastTransitionTime":"2026-02-24T10:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.566489 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.572401 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.572455 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.572469 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.572488 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.572501 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:06Z","lastTransitionTime":"2026-02-24T10:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.585548 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.590880 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.590964 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.590976 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.591000 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:06 crc kubenswrapper[4985]: I0224 10:10:06.591092 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:06Z","lastTransitionTime":"2026-02-24T10:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.608495 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.608669 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.608703 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.709670 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.810844 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:06 crc kubenswrapper[4985]: E0224 10:10:06.911947 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.012972 4985 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.113538 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.214736 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.315770 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.416429 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.516610 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.616805 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.717982 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.818991 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:07 crc kubenswrapper[4985]: E0224 10:10:07.919468 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.019979 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.121147 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.221349 4985 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.322055 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.422735 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.523836 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.624284 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.725378 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.826490 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:08 crc kubenswrapper[4985]: E0224 10:10:08.927043 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.028206 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.128521 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.228977 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.329750 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc 
kubenswrapper[4985]: E0224 10:10:09.430433 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.531432 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.631517 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.732407 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.833430 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:09 crc kubenswrapper[4985]: E0224 10:10:09.933882 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.034814 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.136024 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.236856 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: I0224 10:10:10.263861 4985 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:10:10 crc kubenswrapper[4985]: I0224 10:10:10.265396 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:10 crc kubenswrapper[4985]: I0224 10:10:10.265472 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 10:10:10 crc kubenswrapper[4985]: I0224 10:10:10.265495 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.337846 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.438379 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.538939 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.639304 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.739652 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.839809 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:10 crc kubenswrapper[4985]: E0224 10:10:10.940719 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.041697 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.142076 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.242776 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.343652 4985 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.444082 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.544282 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.645361 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.745964 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.846990 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:11 crc kubenswrapper[4985]: E0224 10:10:11.947881 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:12 crc kubenswrapper[4985]: E0224 10:10:12.048864 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:12 crc kubenswrapper[4985]: E0224 10:10:12.150005 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:12 crc kubenswrapper[4985]: E0224 10:10:12.250161 4985 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.295836 4985 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.352962 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 
10:10:12.353034 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.353053 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.353078 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.353096 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.455705 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.455742 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.455755 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.455771 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.455783 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.558606 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.558650 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.558665 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.558686 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.558699 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.661770 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.661835 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.661858 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.661926 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.661970 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.764876 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.764979 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.764996 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.765018 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.765035 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.867304 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.867343 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.867353 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.867369 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.867380 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.969258 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.969302 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.969314 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.969332 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:12 crc kubenswrapper[4985]: I0224 10:10:12.969344 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:12Z","lastTransitionTime":"2026-02-24T10:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.072031 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.072057 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.072066 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.072080 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.072089 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.174945 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.174999 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.175018 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.175042 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.175060 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.225477 4985 apiserver.go:52] "Watching apiserver" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.232292 4985 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.232840 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-machine-config-operator/machine-config-daemon-hq52w","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-image-registry/node-ca-jj7jq","openshift-multus/multus-additional-cni-plugins-xj9h5","openshift-multus/multus-q24bf","openshift-multus/network-metrics-daemon-xkc65","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8","openshift-ovn-kubernetes/ovnkube-node-27dpt","openshift-dns/node-resolver-qtqms","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb"] Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233351 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233397 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.233433 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233478 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233369 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233651 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.233640 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.233738 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233982 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.233979 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.234275 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.234692 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.234751 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.234818 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.234966 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.235061 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qtqms" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.235607 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.235833 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.235965 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jj7jq" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.236490 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.240859 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.240929 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.241645 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242065 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242273 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242389 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242402 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242641 4985 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242722 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242774 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242819 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.242829 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.243062 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.244758 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.245210 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.245613 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.245807 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.245914 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 
10:10:13.246176 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246223 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246489 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246629 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246739 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246772 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246804 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246919 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246951 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246971 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.246633 4985 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.247071 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.247171 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.247202 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.247232 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.247337 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.247454 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.269576 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.277446 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.277496 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.277505 4985 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.277519 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.277529 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.285761 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.296462 4985 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.297630 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.310878 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.319347 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.332998 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.349823 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.368785 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.378419 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.379804 4985 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.379959 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.380083 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.380203 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.380315 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382321 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382444 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382548 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382639 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382726 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382823 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382927 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383019 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383107 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383195 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383285 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 
10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383368 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383470 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383565 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382738 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383659 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.382942 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383080 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383438 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383450 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383633 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383633 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383784 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.383795 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384007 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384016 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384023 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384168 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384355 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384473 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384580 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384679 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384775 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384881 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385029 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385120 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385203 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385294 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385400 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 10:10:13 crc 
kubenswrapper[4985]: I0224 10:10:13.384646 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384700 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384768 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.384920 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385225 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385370 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385490 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385582 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385602 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385627 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385674 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385770 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385786 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385801 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385815 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385829 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385843 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385857 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385872 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385465 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385910 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385492 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385644 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385872 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385927 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385973 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.385999 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386023 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386046 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386066 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386085 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386098 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386108 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386141 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386167 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386198 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386224 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386247 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386271 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386276 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386287 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386297 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386322 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386344 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386365 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386389 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386411 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386434 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386459 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386482 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 
10:10:13.386511 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386533 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386555 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386580 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386603 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386625 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386649 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386674 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386695 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386717 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386737 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 
10:10:13.386757 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386782 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386804 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386827 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386848 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386870 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386911 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386934 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386957 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386981 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387005 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 10:10:13 crc 
kubenswrapper[4985]: I0224 10:10:13.387028 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387052 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387076 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387099 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387121 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387145 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387169 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387190 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387214 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387235 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387257 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:10:13 crc kubenswrapper[4985]: 
I0224 10:10:13.386413 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386434 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388630 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386467 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386622 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386735 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.386857 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387017 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387195 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387247 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387249 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387246 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387257 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.387276 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:10:13.887259041 +0000 UTC m=+98.361451601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387343 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387388 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387477 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387472 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387528 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387525 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387697 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387712 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387744 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.387897 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388222 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388246 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388307 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388392 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388458 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388557 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388668 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.388847 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.389077 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.389635 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.390763 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.390762 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.390252 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.390531 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391013 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391086 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391179 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.389657 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.389720 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391411 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391527 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391631 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391712 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391787 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391854 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391942 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392016 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392082 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392151 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: 
I0224 10:10:13.392229 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392321 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392390 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392456 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392528 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392604 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392681 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391590 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391959 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392779 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393258 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393298 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393323 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393347 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393373 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.391987 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392200 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392171 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392300 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392652 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392706 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392832 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.392681 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393349 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393394 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393439 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393461 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393471 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393623 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393705 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393735 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393759 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393796 4985 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393820 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393842 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393867 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393923 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393948 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393971 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393985 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394002 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.393994 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394062 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394072 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394172 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394201 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394275 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394306 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394333 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394361 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394387 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394410 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394435 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394460 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394484 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394512 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394537 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394562 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394588 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394615 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394641 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394664 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394688 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394714 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394742 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394767 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394791 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394816 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394845 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394868 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394909 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394934 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394956 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394980 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395002 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395024 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395049 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395081 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395107 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395131 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394303 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394450 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394734 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.394755 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.395095 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396331 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396393 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396470 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396144 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396565 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396880 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396882 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396924 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396932 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.396943 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397123 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397233 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397247 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397285 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397293 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397486 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397606 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397715 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.398377 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.398400 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.398455 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.398475 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.397380 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.398691 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.398867 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399188 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399213 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399237 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399235 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399264 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399412 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399499 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399420 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399558 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399809 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399972 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399997 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.399430 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.400147 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.400216 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.400532 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.400567 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.400818 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401099 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.400994 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401919 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401399 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401401 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401409 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401436 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401766 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401791 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402260 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402357 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402453 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401803 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401833 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.401835 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402874 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402926 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402950 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.402965 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.403574 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404253 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404689 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404709 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404752 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404864 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404756 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.404972 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405145 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405168 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405461 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405067 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405469 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405656 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405695 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405820 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405864 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405917 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405947 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.405974 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406098 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406107 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406164 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406200 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406295 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406323 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406352 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406378 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.406984 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407040 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407094 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407119 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407129 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407321 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407663 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408490 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.407667 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408061 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408584 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408285 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408338 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408355 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408381 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408512 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408807 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3a34c00-910b-400b-bc96-9d805e076b7c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408847 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g9sh\" (UniqueName: \"kubernetes.io/projected/731349d2-7b07-4bc9-81f8-c7d75bca842a-kube-api-access-5g9sh\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408873 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svj5z\" (UniqueName: \"kubernetes.io/projected/c4bc690c-38e6-488d-97b2-9bf37a916fe4-kube-api-access-svj5z\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408927 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-system-cni-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408950 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-multus-certs\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408978 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.408981 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409009 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-slash\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409039 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-netd\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409082 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409092 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409112 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409118 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409127 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/11c1c7b8-18df-4583-849f-76b62544344b-rootfs\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409164 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409280 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409322 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c43e3252-1b22-48a0-8895-ad98fc88b7cf-hosts-file\") pod \"node-resolver-qtqms\" (UID: \"c43e3252-1b22-48a0-8895-ad98fc88b7cf\") " pod="openshift-dns/node-resolver-qtqms" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409352 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-kubelet\") 
pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409382 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovn-node-metrics-cert\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409423 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409425 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-socket-dir-parent\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409474 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-kubelet\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409520 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409534 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409554 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-systemd\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409639 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-var-lib-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409681 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-config\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409718 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-cnibin\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409757 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409760 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409776 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-os-release\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409808 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-systemd-units\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409926 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.409992 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-env-overrides\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410040 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-host\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410075 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvmf8\" (UniqueName: \"kubernetes.io/projected/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-kube-api-access-rvmf8\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410117 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-conf-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410152 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410187 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410343 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-system-cni-dir\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410395 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410430 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410464 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-cni-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410551 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410593 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cni-binary-copy\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410628 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410657 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-serviceca\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.410811 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.410980 4985 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.411012 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:13.910984477 +0000 UTC m=+98.385177077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411058 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfbt\" (UniqueName: \"kubernetes.io/projected/11c1c7b8-18df-4583-849f-76b62544344b-kube-api-access-kcfbt\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411107 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:13 crc 
kubenswrapper[4985]: I0224 10:10:13.411145 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/731349d2-7b07-4bc9-81f8-c7d75bca842a-cni-binary-copy\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411176 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvcfx\" (UniqueName: \"kubernetes.io/projected/d4340d1a-60cb-4240-87ba-1e468c9c41cf-kube-api-access-lvcfx\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.411190 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411206 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-node-log\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.411232 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:13.911219264 +0000 UTC m=+98.385411884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411250 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-log-socket\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411268 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-bin\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411284 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3a34c00-910b-400b-bc96-9d805e076b7c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411303 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-cni-multus\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: 
I0224 10:10:13.411322 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411337 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411353 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-etc-kubernetes\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411369 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-etc-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411383 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-ovn-kubernetes\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411399 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411415 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6c9\" (UniqueName: \"kubernetes.io/projected/1b3986ef-e9be-43db-9350-ccc7dd3f713f-kube-api-access-9n6c9\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411429 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-os-release\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411444 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-daemon-config\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411460 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411477 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-k8s-cni-cncf-io\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411496 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411516 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-ovn\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411536 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11c1c7b8-18df-4583-849f-76b62544344b-proxy-tls\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411548 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411557 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-cni-bin\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411645 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdldr\" (UniqueName: \"kubernetes.io/projected/c43e3252-1b22-48a0-8895-ad98fc88b7cf-kube-api-access-jdldr\") pod \"node-resolver-qtqms\" (UID: \"c43e3252-1b22-48a0-8895-ad98fc88b7cf\") " pod="openshift-dns/node-resolver-qtqms" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411680 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cnibin\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411737 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411761 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-hostroot\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411821 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411866 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-script-lib\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411922 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411947 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3a34c00-910b-400b-bc96-9d805e076b7c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.411971 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2zg8\" (UniqueName: \"kubernetes.io/projected/c3a34c00-910b-400b-bc96-9d805e076b7c-kube-api-access-s2zg8\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412001 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/11c1c7b8-18df-4583-849f-76b62544344b-mcd-auth-proxy-config\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412022 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-netns\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412044 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-netns\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412161 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412177 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412191 4985 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412206 4985 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412219 4985 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412232 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412245 
4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412259 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412277 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412290 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412303 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412315 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412330 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412351 4985 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412368 4985 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412385 4985 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412400 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412411 4985 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412424 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412438 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412449 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" 
DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412461 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412472 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412484 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412496 4985 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412508 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412522 4985 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412534 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412545 
4985 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412559 4985 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412576 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412594 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412609 4985 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412621 4985 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412634 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412648 4985 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412660 4985 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412673 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412684 4985 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412696 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412708 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412720 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412732 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412745 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412757 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412770 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412783 4985 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412797 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412809 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412823 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412834 4985 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412847 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412860 4985 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412875 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412917 4985 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412931 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412945 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412959 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412971 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412985 4985 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.412997 4985 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413008 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413021 4985 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413033 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413044 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413056 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413069 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413081 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413092 4985 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413104 4985 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413118 4985 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413131 4985 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413145 4985 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413158 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413171 4985 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413183 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413194 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413174 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413206 4985 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413218 4985 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413239 4985 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413281 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413290 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413302 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413937 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413958 4985 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413971 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413984 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.413997 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414010 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414023 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414037 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414049 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414062 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414073 4985 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414085 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414097 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414109 4985 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414121 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414133 4985 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414144 4985 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414156 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414168 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414180 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414192 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414203 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node 
\"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414216 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414228 4985 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414242 4985 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414253 4985 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414265 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414276 4985 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414288 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414299 4985 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414310 4985 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414327 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414339 4985 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414351 4985 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414363 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414375 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414387 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414401 4985 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414414 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414428 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414439 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414451 4985 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414463 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414474 4985 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath 
\"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414486 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414497 4985 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414511 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414523 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414534 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414547 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414561 4985 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414617 4985 reconciler_common.go:293] 
"Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414633 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414646 4985 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414657 4985 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414669 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414681 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414695 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414708 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414721 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414733 4985 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414744 4985 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414756 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414769 4985 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414780 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414791 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 
10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414806 4985 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414817 4985 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414828 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414847 4985 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414859 4985 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414870 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414910 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414922 4985 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414934 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414946 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414957 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414970 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414981 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.414992 4985 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415004 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415016 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415028 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415039 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415051 4985 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415062 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415076 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415087 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" 
DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415102 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415116 4985 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415128 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415139 4985 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415150 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415161 4985 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.415173 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.417258 4985 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.425211 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.425247 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.425262 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.425344 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.425472 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:13.925308695 +0000 UTC m=+98.399501255 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.425734 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.426717 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.427584 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.427680 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.427757 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.427871 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:13.927850707 +0000 UTC m=+98.402043267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.428544 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.428573 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.428969 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.429622 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.429950 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.431444 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.432058 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.432125 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.432221 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.432397 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.433549 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.433803 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.433952 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.434252 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.440051 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.441307 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.444939 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.446203 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.450937 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.453181 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.459186 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.460139 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.470635 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.483854 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.483905 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.483915 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.483930 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.483940 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516621 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3a34c00-910b-400b-bc96-9d805e076b7c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516669 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9sh\" (UniqueName: \"kubernetes.io/projected/731349d2-7b07-4bc9-81f8-c7d75bca842a-kube-api-access-5g9sh\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516695 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svj5z\" (UniqueName: \"kubernetes.io/projected/c4bc690c-38e6-488d-97b2-9bf37a916fe4-kube-api-access-svj5z\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516715 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-system-cni-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516735 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-multus-certs\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " 
pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516755 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516774 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-slash\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516797 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-netd\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516818 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516838 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/11c1c7b8-18df-4583-849f-76b62544344b-rootfs\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " 
pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516860 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516874 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-multus-certs\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516979 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/11c1c7b8-18df-4583-849f-76b62544344b-rootfs\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517053 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517074 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-system-cni-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc 
kubenswrapper[4985]: I0224 10:10:13.517084 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-netd\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.516880 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c43e3252-1b22-48a0-8895-ad98fc88b7cf-hosts-file\") pod \"node-resolver-qtqms\" (UID: \"c43e3252-1b22-48a0-8895-ad98fc88b7cf\") " pod="openshift-dns/node-resolver-qtqms" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.517138 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.517227 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:10:14.017204681 +0000 UTC m=+98.491397281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517266 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-kubelet\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517225 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c43e3252-1b22-48a0-8895-ad98fc88b7cf-hosts-file\") pod \"node-resolver-qtqms\" (UID: \"c43e3252-1b22-48a0-8895-ad98fc88b7cf\") " pod="openshift-dns/node-resolver-qtqms" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517311 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovn-node-metrics-cert\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517402 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-config\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517506 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-kubelet\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517601 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-slash\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.517868 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-socket-dir-parent\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.518219 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.518446 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-config\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.518727 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-socket-dir-parent\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.518974 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-kubelet\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519076 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-systemd\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519133 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-var-lib-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519143 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-systemd\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519101 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-kubelet\") pod \"multus-q24bf\" 
(UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519180 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-host\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519227 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvmf8\" (UniqueName: \"kubernetes.io/projected/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-kube-api-access-rvmf8\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519250 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-cnibin\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519271 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-os-release\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519196 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-var-lib-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc 
kubenswrapper[4985]: I0224 10:10:13.519318 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-systemd-units\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519357 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-env-overrides\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519383 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-conf-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519399 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3a34c00-910b-400b-bc96-9d805e076b7c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519412 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-system-cni-dir\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5" Feb 24 10:10:13 crc 
kubenswrapper[4985]: I0224 10:10:13.519435 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-systemd-units\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519464 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-cni-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519429 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-host\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519494 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cni-binary-copy\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519527 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-cnibin\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519552 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519585 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-os-release\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519478 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-conf-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519444 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-system-cni-dir\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519708 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-cni-dir\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519803 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-serviceca\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519845 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-log-socket\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519853 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519905 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-bin\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519947 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfbt\" (UniqueName: \"kubernetes.io/projected/11c1c7b8-18df-4583-849f-76b62544344b-kube-api-access-kcfbt\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.519997 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/731349d2-7b07-4bc9-81f8-c7d75bca842a-cni-binary-copy\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520027 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvcfx\" (UniqueName: \"kubernetes.io/projected/d4340d1a-60cb-4240-87ba-1e468c9c41cf-kube-api-access-lvcfx\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520054 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cni-binary-copy\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520094 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-node-log\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520104 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-log-socket\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520055 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-node-log\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520159 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3a34c00-910b-400b-bc96-9d805e076b7c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520194 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-cni-multus\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520229 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-etc-kubernetes\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520259 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-etc-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520272 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-bin\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520292 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-ovn-kubernetes\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520315 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-cni-multus\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520327 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520361 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6c9\" (UniqueName: \"kubernetes.io/projected/1b3986ef-e9be-43db-9350-ccc7dd3f713f-kube-api-access-9n6c9\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520448 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-os-release\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520479 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-daemon-config\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520524 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-k8s-cni-cncf-io\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520556 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520586 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-ovn\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520617 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11c1c7b8-18df-4583-849f-76b62544344b-proxy-tls\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520646 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-cni-bin\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520679 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdldr\" (UniqueName: \"kubernetes.io/projected/c43e3252-1b22-48a0-8895-ad98fc88b7cf-kube-api-access-jdldr\") pod \"node-resolver-qtqms\" (UID: \"c43e3252-1b22-48a0-8895-ad98fc88b7cf\") " pod="openshift-dns/node-resolver-qtqms"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520709 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cnibin\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520741 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520769 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-hostroot\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520797 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-script-lib\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520814 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-etc-kubernetes\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520831 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-netns\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520858 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-etc-openvswitch\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520864 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3a34c00-910b-400b-bc96-9d805e076b7c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520872 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3a34c00-910b-400b-bc96-9d805e076b7c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520908 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-ovn-kubernetes\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520922 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2zg8\" (UniqueName: \"kubernetes.io/projected/c3a34c00-910b-400b-bc96-9d805e076b7c-kube-api-access-s2zg8\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522399 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/11c1c7b8-18df-4583-849f-76b62544344b-mcd-auth-proxy-config\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522481 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-netns\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522605 4985 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522625 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522639 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523031 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523046 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523059 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523074 4985 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523362 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522038 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-netns\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521342 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-hostroot\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520942 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523086 4985 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523626 4985 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523642 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523655 4985 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523692 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523707 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523724 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523848 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523864 4985 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521686 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/731349d2-7b07-4bc9-81f8-c7d75bca842a-cni-binary-copy\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521763 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/731349d2-7b07-4bc9-81f8-c7d75bca842a-multus-daemon-config\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521792 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-k8s-cni-cncf-io\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523770 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-run-netns\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.520969 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-host-var-lib-cni-bin\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.523732 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3a34c00-910b-400b-bc96-9d805e076b7c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521812 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-ovn\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.522004 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-cnibin\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521035 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/731349d2-7b07-4bc9-81f8-c7d75bca842a-os-release\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.524035 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4bc690c-38e6-488d-97b2-9bf37a916fe4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.521781 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-serviceca\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.524843 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovn-node-metrics-cert\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.525029 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11c1c7b8-18df-4583-849f-76b62544344b-proxy-tls\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.525174 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-env-overrides\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.528193 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-script-lib\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.528527 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/11c1c7b8-18df-4583-849f-76b62544344b-mcd-auth-proxy-config\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.536610 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g9sh\" (UniqueName: \"kubernetes.io/projected/731349d2-7b07-4bc9-81f8-c7d75bca842a-kube-api-access-5g9sh\") pod \"multus-q24bf\" (UID: \"731349d2-7b07-4bc9-81f8-c7d75bca842a\") " pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.537385 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6c9\" (UniqueName: \"kubernetes.io/projected/1b3986ef-e9be-43db-9350-ccc7dd3f713f-kube-api-access-9n6c9\") pod \"ovnkube-node-27dpt\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.538021 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svj5z\" (UniqueName: \"kubernetes.io/projected/c4bc690c-38e6-488d-97b2-9bf37a916fe4-kube-api-access-svj5z\") pod \"multus-additional-cni-plugins-xj9h5\" (UID: \"c4bc690c-38e6-488d-97b2-9bf37a916fe4\") " pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.541126 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfbt\" (UniqueName: \"kubernetes.io/projected/11c1c7b8-18df-4583-849f-76b62544344b-kube-api-access-kcfbt\") pod \"machine-config-daemon-hq52w\" (UID: \"11c1c7b8-18df-4583-849f-76b62544344b\") " pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.542617 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2zg8\" (UniqueName: \"kubernetes.io/projected/c3a34c00-910b-400b-bc96-9d805e076b7c-kube-api-access-s2zg8\") pod \"ovnkube-control-plane-749d76644c-kkmm8\" (UID: \"c3a34c00-910b-400b-bc96-9d805e076b7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.543414 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvmf8\" (UniqueName: \"kubernetes.io/projected/4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d-kube-api-access-rvmf8\") pod \"node-ca-jj7jq\" (UID: \"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\") " pod="openshift-image-registry/node-ca-jj7jq"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.543702 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdldr\" (UniqueName: \"kubernetes.io/projected/c43e3252-1b22-48a0-8895-ad98fc88b7cf-kube-api-access-jdldr\") pod \"node-resolver-qtqms\" (UID: \"c43e3252-1b22-48a0-8895-ad98fc88b7cf\") " pod="openshift-dns/node-resolver-qtqms"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.545648 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvcfx\" (UniqueName: \"kubernetes.io/projected/d4340d1a-60cb-4240-87ba-1e468c9c41cf-kube-api-access-lvcfx\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.560169 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.571537 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.580157 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.586678 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.586721 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.586738 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.586763 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.586777 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.589957 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xj9h5"
Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.591671 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-933abe24afd93569212d61dede6cf420dc7280c19ce65a92b42d8dd96e176043 WatchSource:0}: Error finding container 933abe24afd93569212d61dede6cf420dc7280c19ce65a92b42d8dd96e176043: Status 404 returned error can't find the container with id 933abe24afd93569212d61dede6cf420dc7280c19ce65a92b42d8dd96e176043
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.596428 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q24bf"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.603757 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt"
Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.605511 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-31a7fcba4773263eb1e6cc295c1d167aaac25a2df349b707a23bc031bb0443cd WatchSource:0}: Error finding container 31a7fcba4773263eb1e6cc295c1d167aaac25a2df349b707a23bc031bb0443cd: Status 404 returned error can't find the container with id 31a7fcba4773263eb1e6cc295c1d167aaac25a2df349b707a23bc031bb0443cd
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.610341 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.617110 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qtqms"
Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.617807 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4bc690c_38e6_488d_97b2_9bf37a916fe4.slice/crio-7a1afe95c8b202a5a6c3bd93b8ec63f80c82457c60f90a490619264823bd1a58 WatchSource:0}: Error finding container 7a1afe95c8b202a5a6c3bd93b8ec63f80c82457c60f90a490619264823bd1a58: Status 404 returned error can't find the container with id 7a1afe95c8b202a5a6c3bd93b8ec63f80c82457c60f90a490619264823bd1a58
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.623740 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.629187 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jj7jq"
Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.629674 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731349d2_7b07_4bc9_81f8_c7d75bca842a.slice/crio-0092ff32e7249299753b54b0bcb79130650c00c943793244051b24144a14f65c WatchSource:0}: Error finding container 0092ff32e7249299753b54b0bcb79130650c00c943793244051b24144a14f65c: Status 404 returned error can't find the container with id 0092ff32e7249299753b54b0bcb79130650c00c943793244051b24144a14f65c
Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.639497 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3986ef_e9be_43db_9350_ccc7dd3f713f.slice/crio-4eeee5c90be69f13ae3c1ebb0228649f43f2c4c7a73100d1c9ccefe388fd4555 WatchSource:0}: Error finding container 4eeee5c90be69f13ae3c1ebb0228649f43f2c4c7a73100d1c9ccefe388fd4555: Status 404 returned error can't find the container
with id 4eeee5c90be69f13ae3c1ebb0228649f43f2c4c7a73100d1c9ccefe388fd4555 Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.643152 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerStarted","Data":"0092ff32e7249299753b54b0bcb79130650c00c943793244051b24144a14f65c"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.644197 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerStarted","Data":"7a1afe95c8b202a5a6c3bd93b8ec63f80c82457c60f90a490619264823bd1a58"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.645524 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"933abe24afd93569212d61dede6cf420dc7280c19ce65a92b42d8dd96e176043"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.647025 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"31a7fcba4773263eb1e6cc295c1d167aaac25a2df349b707a23bc031bb0443cd"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.648121 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"00ad55da7c73e5185d4c5f408343a0106393ad30c60697f4591b597fbb1e44eb"} Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.653106 4985 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3a34c00_910b_400b_bc96_9d805e076b7c.slice/crio-f3b03001d537c3e090c6dc964668a551e23bccc5f45769572280fcb8ebd47aee WatchSource:0}: Error finding container f3b03001d537c3e090c6dc964668a551e23bccc5f45769572280fcb8ebd47aee: Status 404 returned error can't find the container with id f3b03001d537c3e090c6dc964668a551e23bccc5f45769572280fcb8ebd47aee Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.655507 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc43e3252_1b22_48a0_8895_ad98fc88b7cf.slice/crio-b859c2a48d649aec05ca4a7b1cc1872beb817821e23192f94198288eb57f81bc WatchSource:0}: Error finding container b859c2a48d649aec05ca4a7b1cc1872beb817821e23192f94198288eb57f81bc: Status 404 returned error can't find the container with id b859c2a48d649aec05ca4a7b1cc1872beb817821e23192f94198288eb57f81bc Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.688976 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.689024 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.689044 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.689063 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.689078 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: W0224 10:10:13.691770 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4776cd42_9a5c_4c2e_a585_a1f4a49d2d6d.slice/crio-d12882ad66a91962a95af9533f5bedd33c87ca51a16c6004e5b7fe6f4af821e9 WatchSource:0}: Error finding container d12882ad66a91962a95af9533f5bedd33c87ca51a16c6004e5b7fe6f4af821e9: Status 404 returned error can't find the container with id d12882ad66a91962a95af9533f5bedd33c87ca51a16c6004e5b7fe6f4af821e9 Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.795720 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.796069 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.796082 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.796098 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.796138 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.899317 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.899346 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.899354 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.899370 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.899379 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:13Z","lastTransitionTime":"2026-02-24T10:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.927410 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.927602 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 10:10:14.927571234 +0000 UTC m=+99.401763794 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.927716 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.927755 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:13 crc kubenswrapper[4985]: I0224 10:10:13.927789 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.927934 4985 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.927978 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:14.927968765 +0000 UTC m=+99.402161385 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.928284 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.928467 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:14.928430338 +0000 UTC m=+99.402622908 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.928302 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.928654 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.928739 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:13 crc kubenswrapper[4985]: E0224 10:10:13.928872 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:14.928858761 +0000 UTC m=+99.403051331 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.001978 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.002030 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.002040 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.002057 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.002066 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.029385 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.029471 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.029619 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.029640 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.029654 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.029622 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.029710 
4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:15.029691462 +0000 UTC m=+99.503884022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.029784 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:10:15.029750374 +0000 UTC m=+99.503942934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.104538 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.104607 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.104619 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.104641 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.104670 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.207484 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.207537 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.207549 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.207574 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.207586 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.267902 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.268497 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.269788 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.270503 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.271423 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.271950 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.272510 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.273597 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.274280 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.275304 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.275906 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.276956 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.277475 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.278131 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.279090 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.279581 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.280526 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.280906 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.281543 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.282532 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.283013 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.284105 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.284523 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.285508 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.285970 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.286531 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.287537 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.288100 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.289056 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.289578 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.290469 4985 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.290601 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.292408 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.293370 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.293881 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.295481 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.296210 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.297164 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.297854 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.298972 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.299439 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.300540 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.301241 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.302286 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.302759 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.303646 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.304368 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.305463 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.305957 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.306792 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.307302 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.308406 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.309057 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.309630 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.309934 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.309990 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc 
kubenswrapper[4985]: I0224 10:10:14.310003 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.310023 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.310037 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.372427 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.372644 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.412775 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.412814 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.412825 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.412840 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.412850 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.515220 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.515497 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.515507 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.515521 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.515534 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.617413 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.617445 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.617453 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.617466 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.617476 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.652693 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qtqms" event={"ID":"c43e3252-1b22-48a0-8895-ad98fc88b7cf","Type":"ContainerStarted","Data":"778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.652781 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qtqms" event={"ID":"c43e3252-1b22-48a0-8895-ad98fc88b7cf","Type":"ContainerStarted","Data":"b859c2a48d649aec05ca4a7b1cc1872beb817821e23192f94198288eb57f81bc"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.654217 4985 generic.go:334] "Generic (PLEG): container finished" podID="c4bc690c-38e6-488d-97b2-9bf37a916fe4" containerID="5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6" exitCode=0 Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.654277 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerDied","Data":"5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.656551 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jj7jq" event={"ID":"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d","Type":"ContainerStarted","Data":"6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.656581 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jj7jq" event={"ID":"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d","Type":"ContainerStarted","Data":"d12882ad66a91962a95af9533f5bedd33c87ca51a16c6004e5b7fe6f4af821e9"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.664532 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.664598 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.664619 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"22659ff5967394ff05fdd0da8a98415b45b65458bf97daba46f19186c2acacf6"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.667943 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" exitCode=0 Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.668049 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.668116 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"4eeee5c90be69f13ae3c1ebb0228649f43f2c4c7a73100d1c9ccefe388fd4555"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.670084 4985 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.672755 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.674617 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.674973 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.676421 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerStarted","Data":"85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.681946 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.681985 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.684581 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" event={"ID":"c3a34c00-910b-400b-bc96-9d805e076b7c","Type":"ContainerStarted","Data":"52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.684616 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" event={"ID":"c3a34c00-910b-400b-bc96-9d805e076b7c","Type":"ContainerStarted","Data":"65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.684631 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" event={"ID":"c3a34c00-910b-400b-bc96-9d805e076b7c","Type":"ContainerStarted","Data":"f3b03001d537c3e090c6dc964668a551e23bccc5f45769572280fcb8ebd47aee"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.686557 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.691078 4985 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.707270 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.722277 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.722319 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.722331 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.722349 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.722363 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.723032 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc 
kubenswrapper[4985]: I0224 10:10:14.750368 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.767247 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T
10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.794231 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.809430 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.825531 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.825611 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.825626 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.825644 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.825657 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.828448 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.847814 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.861785 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.876274 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.894795 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.908073 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.918091 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.931742 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.931782 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.931795 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 
10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.931814 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.931827 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:14Z","lastTransitionTime":"2026-02-24T10:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.936293 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.939116 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939216 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:10:16.939183845 +0000 UTC m=+101.413376405 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.939270 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.939329 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.939357 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939481 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 
24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939519 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:16.939509344 +0000 UTC m=+101.413701904 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939913 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939928 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939938 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.939962 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:16.939955187 +0000 UTC m=+101.414147747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.940023 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: E0224 10:10:14.940060 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:16.94003729 +0000 UTC m=+101.414229850 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.951417 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.964750 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.975793 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:14 crc kubenswrapper[4985]: I0224 10:10:14.989364 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:14Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.004325 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.017104 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.028620 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc 
kubenswrapper[4985]: I0224 10:10:15.034554 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.034769 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.034866 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.034971 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.035058 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.040476 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.040516 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.040612 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.040649 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:10:17.040638154 +0000 UTC m=+101.514830714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.040926 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.040957 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.040969 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.041002 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:17.040993854 +0000 UTC m=+101.515186414 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.063746 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.075341 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.086988 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.098984 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.115139 4985 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.138183 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.138392 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.138473 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.138554 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.138741 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.140754 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.155134 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.241867 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc 
kubenswrapper[4985]: I0224 10:10:15.241948 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.241974 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.241996 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.242057 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.263871 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.263964 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.264007 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.264104 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.264181 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.264227 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.264292 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:15 crc kubenswrapper[4985]: E0224 10:10:15.264363 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.343444 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.343481 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.343492 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.343506 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.343515 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.446086 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.446387 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.446399 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.446417 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.446431 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.549555 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.549595 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.549606 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.549623 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.549634 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.651392 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.651434 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.651442 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.651458 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.651468 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.694571 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.694619 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.694636 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.694649 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.694660 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.696579 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerStarted","Data":"e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539"} Feb 24 10:10:15 crc 
kubenswrapper[4985]: I0224 10:10:15.712055 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.727532 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.740443 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.751464 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.753165 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.753199 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.753208 4985 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.753223 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.753233 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.764100 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.774536 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.783236 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.795919 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.808245 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 
10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.819494 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.830995 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.841359 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.850372 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc 
kubenswrapper[4985]: I0224 10:10:15.855399 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.855443 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.855453 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.855470 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.855486 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.869701 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.878702 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.958092 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.958132 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.958142 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.958157 4985 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 24 10:10:15 crc kubenswrapper[4985]: I0224 10:10:15.958166 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:15Z","lastTransitionTime":"2026-02-24T10:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.060161 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.060188 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.060196 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.060208 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.060218 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.162924 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.162960 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.162971 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.162988 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.163000 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.265537 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.265628 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.265652 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.265667 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.265676 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.275571 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.290475 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.302698 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.312194 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.323080 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.334943 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.343353 4985 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.350554 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.364668 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.367737 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.367769 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.367777 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.367793 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.367802 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.376552 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.385261 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.396069 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.405227 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.412986 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc 
kubenswrapper[4985]: I0224 10:10:16.427755 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.470331 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.470397 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.470409 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.470435 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.470449 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.573604 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.573660 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.573673 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.573692 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.573706 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.676658 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.676741 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.676765 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.676797 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.676824 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.702358 4985 generic.go:334] "Generic (PLEG): container finished" podID="c4bc690c-38e6-488d-97b2-9bf37a916fe4" containerID="e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539" exitCode=0 Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.702405 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerDied","Data":"e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.705204 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.710640 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.720852 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.736079 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.756160 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.770632 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.779325 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.779353 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.779363 4985 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.779376 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.779385 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.783584 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.797696 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.807996 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed
16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.818911 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.831322 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.843199 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.866738 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.878473 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.881452 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.881514 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.881526 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.881545 4985 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.881556 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.890479 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.902860 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.913771 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc 
kubenswrapper[4985]: I0224 10:10:16.925571 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.936313 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.949160 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.957770 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.957879 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.957933 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.957955 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958072 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958162 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:20.958147286 +0000 UTC m=+105.432339846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958239 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958266 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958278 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958281 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958329 4985 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:20.95831225 +0000 UTC m=+105.432504810 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958372 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:20.958352341 +0000 UTC m=+105.432544921 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.958704 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:10:20.958670101 +0000 UTC m=+105.432862671 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.959002 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.959031 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.959041 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.959054 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.959063 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.959400 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.969756 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc 
kubenswrapper[4985]: E0224 10:10:16.972003 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.975132 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.975202 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.975214 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.975234 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.975245 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.985809 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.986442 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.988619 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.988646 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.988655 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.988668 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.988676 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:16Z","lastTransitionTime":"2026-02-24T10:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:16 crc kubenswrapper[4985]: I0224 10:10:16.995962 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:16 crc kubenswrapper[4985]: E0224 10:10:16.999007 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.004791 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.004826 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.004835 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.004849 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.004858 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.007359 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.015652 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.018102 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.018725 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.018760 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.018770 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.018784 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.018792 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.031911 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.033714 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.033882 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.035346 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.035373 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.035384 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.035399 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.035408 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.042608 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.053130 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.058713 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.058796 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.058975 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.059007 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.059021 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.059065 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:21.059050368 +0000 UTC m=+105.533242938 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.058972 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.059153 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:10:21.05912524 +0000 UTC m=+105.533317800 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.064815 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.081463 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.096334 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.137920 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc 
kubenswrapper[4985]: I0224 10:10:17.137965 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.137976 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.137993 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.138006 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.240446 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.240492 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.240505 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.240523 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.240534 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.264484 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.264506 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.264595 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.264605 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.264632 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.264742 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.264858 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:17 crc kubenswrapper[4985]: E0224 10:10:17.264929 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.342574 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.342612 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.342621 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.342634 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.342642 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.445052 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.445100 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.445112 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.445129 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.445144 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.547720 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.547806 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.547827 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.547851 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.547868 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.652204 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.652280 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.652304 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.652333 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.652356 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.718593 4985 generic.go:334] "Generic (PLEG): container finished" podID="c4bc690c-38e6-488d-97b2-9bf37a916fe4" containerID="4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076" exitCode=0 Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.718701 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerDied","Data":"4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.744825 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.755614 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.755697 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.755721 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.755753 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.755778 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.766963 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.783721 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc 
kubenswrapper[4985]: I0224 10:10:17.806355 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.824960 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.846017 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.858628 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc 
kubenswrapper[4985]: I0224 10:10:17.858703 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.858726 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.858757 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.858780 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.858741 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.872270 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.885987 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.899987 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.928921 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.945981 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e
9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.960231 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.961737 4985 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.961770 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.961781 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.961799 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.961810 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:17Z","lastTransitionTime":"2026-02-24T10:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.976967 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:17 crc kubenswrapper[4985]: I0224 10:10:17.992640 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.064364 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.064408 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.064424 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.064447 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.064463 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.168103 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.168147 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.168166 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.168190 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.168207 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.270574 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.270645 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.270666 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.270696 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.270717 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.373142 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.373205 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.373232 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.373262 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.373282 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.476640 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.476693 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.476705 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.476722 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.476734 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.579522 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.579567 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.579581 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.579595 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.579605 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.682029 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.682082 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.682097 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.682112 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.682123 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.726272 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.729135 4985 generic.go:334] "Generic (PLEG): container finished" podID="c4bc690c-38e6-488d-97b2-9bf37a916fe4" containerID="91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88" exitCode=0 Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.729183 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerDied","Data":"91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.749155 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.761005 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.770986 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.788546 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.788627 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.788650 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.788687 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.788708 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.793090 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.806461 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.823880 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.838944 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.852619 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.865987 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.875619 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.887382 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.890778 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.890824 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.890842 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.890868 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.890907 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.905430 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.918531 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.930268 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.944948 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.992661 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.992700 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.992712 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.992728 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:18 crc kubenswrapper[4985]: I0224 10:10:18.992739 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:18Z","lastTransitionTime":"2026-02-24T10:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.095597 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.095638 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.095651 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.095668 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.095681 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.198054 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.198096 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.198109 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.198126 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.198136 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.263957 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.263998 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.264033 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.264036 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:19 crc kubenswrapper[4985]: E0224 10:10:19.264124 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:19 crc kubenswrapper[4985]: E0224 10:10:19.264389 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:19 crc kubenswrapper[4985]: E0224 10:10:19.264459 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:19 crc kubenswrapper[4985]: E0224 10:10:19.264657 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.302280 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.302334 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.302347 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.302367 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.302381 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.405523 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.405601 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.405620 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.405649 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.405668 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.508576 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.508617 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.508626 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.508642 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.508652 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.612121 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.612181 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.612198 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.612222 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.612239 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.715054 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.715092 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.715104 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.715123 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.715135 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.737255 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerStarted","Data":"81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.751650 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.771473 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.794239 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e
9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.808735 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.817215 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.817252 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.817264 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc 
kubenswrapper[4985]: I0224 10:10:19.817278 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.817289 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.827356 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.844370 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.864167 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.882106 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.896502 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc 
kubenswrapper[4985]: I0224 10:10:19.918357 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.919423 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.919461 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.919473 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.919490 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.919502 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:19Z","lastTransitionTime":"2026-02-24T10:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.930336 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.943876 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.956773 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.971668 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:19 crc kubenswrapper[4985]: I0224 10:10:19.986249 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:19Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.021802 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.021840 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.021860 4985 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.021900 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.021913 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.124038 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.124083 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.124098 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.124120 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.124133 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.227833 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.227908 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.227920 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.227948 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.227967 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.330531 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.330578 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.330590 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.330609 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.330623 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.432545 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.432580 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.432594 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.432609 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.432620 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.535344 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.535415 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.535458 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.535489 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.535514 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.638741 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.638783 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.638792 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.638806 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.638816 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.742078 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.742575 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.742598 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.742632 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.742655 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.750926 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.751327 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.751400 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.751429 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.755447 4985 generic.go:334] "Generic (PLEG): container finished" podID="c4bc690c-38e6-488d-97b2-9bf37a916fe4" containerID="81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd" exitCode=0 Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.755537 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerDied","Data":"81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.786648 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.786988 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.788490 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.801540 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.821325 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.841016 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.844787 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.844812 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.844820 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.844832 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.844842 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.852537 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc 
kubenswrapper[4985]: I0224 10:10:20.864724 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.877201 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.891594 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.905592 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.920299 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.936129 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.947427 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.947484 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.947501 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.947522 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.947540 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:20Z","lastTransitionTime":"2026-02-24T10:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.951831 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.963233 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.976007 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.987864 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.998202 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.998324 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.998352 4985 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998380 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:10:28.998347802 +0000 UTC m=+113.472540372 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.998435 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998460 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998511 4985 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:28.998495626 +0000 UTC m=+113.472688276 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998561 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998585 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998605 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998614 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998643 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:10:28.9986323 +0000 UTC m=+113.472824970 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:20 crc kubenswrapper[4985]: E0224 10:10:20.998665 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:28.99865675 +0000 UTC m=+113.472849440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:20 crc kubenswrapper[4985]: I0224 10:10:20.999772 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:20Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.010801 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.019745 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc 
kubenswrapper[4985]: I0224 10:10:21.034955 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.042854 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.050917 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.050957 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.050969 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.050988 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.050999 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.055276 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.068936 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.080867 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.092187 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.099516 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:21 crc 
kubenswrapper[4985]: I0224 10:10:21.099601 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.099696 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.099778 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:10:29.099757119 +0000 UTC m=+113.573949779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.099812 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.099847 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.099861 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.099941 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:29.099920603 +0000 UTC m=+113.574113243 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.107263 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.119044 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e
9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.128493 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.137467 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.146957 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.153748 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.153797 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.153809 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.153825 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.153841 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.159807 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.255972 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.256022 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.256053 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc 
kubenswrapper[4985]: I0224 10:10:21.256069 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.256081 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.263980 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.263992 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.263997 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.264050 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.264102 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.264188 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.264284 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:21 crc kubenswrapper[4985]: E0224 10:10:21.264363 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.359000 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.359044 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.359053 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.359069 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.359078 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.461010 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.461057 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.461071 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.461089 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.461104 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.564181 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.564224 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.564234 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.564249 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.564261 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.667723 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.667783 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.667800 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.667827 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.667844 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.763524 4985 generic.go:334] "Generic (PLEG): container finished" podID="c4bc690c-38e6-488d-97b2-9bf37a916fe4" containerID="556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174" exitCode=0 Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.763569 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerDied","Data":"556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.770191 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.770243 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.770260 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.770284 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.770302 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.778969 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.794038 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.811111 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.823193 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc 
kubenswrapper[4985]: I0224 10:10:21.852023 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.870074 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.876353 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.876402 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.876415 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.876434 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.876448 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.892952 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.906297 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.916919 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.933107 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.948793 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.962958 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.974437 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.984920 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.984960 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.984970 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.984986 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.984996 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:21Z","lastTransitionTime":"2026-02-24T10:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:21 crc kubenswrapper[4985]: I0224 10:10:21.990570 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.002846 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.087146 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.087186 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.087197 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.087215 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.087229 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.189477 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.189508 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.189538 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.189557 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.189565 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.292008 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.292045 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.292056 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.292073 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.292084 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.394308 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.394364 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.394383 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.394419 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.394439 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.497349 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.497398 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.497410 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.497429 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.497441 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.600531 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.600622 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.600649 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.600683 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.600707 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.707807 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.707874 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.707918 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.707942 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.707959 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.772987 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" event={"ID":"c4bc690c-38e6-488d-97b2-9bf37a916fe4","Type":"ContainerStarted","Data":"ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.793997 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.811134 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.811178 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.811197 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.811220 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.811238 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.816941 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:
10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.836707 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.853297 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.871019 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.888099 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.907855 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.914229 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.914285 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.914298 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.914316 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.914328 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:22Z","lastTransitionTime":"2026-02-24T10:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.936933 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.948689 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.969648 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.984715 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:22 crc kubenswrapper[4985]: I0224 10:10:22.999709 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.011553 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc 
kubenswrapper[4985]: I0224 10:10:23.016930 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.016962 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.016972 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.016987 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.016998 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.036235 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.046464 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.119734 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.119775 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.119787 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.119802 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.119815 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.222086 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.222131 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.222141 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.222155 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.222166 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.263801 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.263845 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.263844 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.263801 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:23 crc kubenswrapper[4985]: E0224 10:10:23.263933 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:23 crc kubenswrapper[4985]: E0224 10:10:23.264054 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:23 crc kubenswrapper[4985]: E0224 10:10:23.264094 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:23 crc kubenswrapper[4985]: E0224 10:10:23.264153 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.324738 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.324771 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.324780 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.324795 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.324804 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.427279 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.427334 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.427345 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.427363 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.427377 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.530141 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.530198 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.530217 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.530239 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.530257 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.632662 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.632719 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.632736 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.632763 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.632781 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.735875 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.735958 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.735976 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.736003 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.736020 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.778480 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/0.log" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.781202 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189" exitCode=1 Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.781703 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.782195 4985 scope.go:117] "RemoveContainer" containerID="59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.802533 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.814659 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.826122 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc 
kubenswrapper[4985]: I0224 10:10:23.837670 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.837721 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.837735 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.837754 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.837767 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.851074 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"message\\\":\\\"or removal\\\\nI0224 10:10:22.971639 6685 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:10:22.971646 6685 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:22.971657 6685 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:10:22.971660 6685 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:10:22.971677 6685 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:10:22.971697 6685 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:10:22.971684 6685 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:10:22.971708 6685 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 10:10:22.971717 6685 factory.go:656] Stopping watch factory\\\\nI0224 10:10:22.971725 6685 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:10:22.971736 6685 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:22.971742 6685 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:22.971684 6685 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:22.971776 6685 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:10:22.972119 6685 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628
a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.863990 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.876193 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.887714 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.902841 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.912481 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.928794 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.939350 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.939381 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.939393 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.939408 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.939420 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:23Z","lastTransitionTime":"2026-02-24T10:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.943098 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.953075 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.962875 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.977921 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:23 crc kubenswrapper[4985]: I0224 10:10:23.991993 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.042037 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.042074 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.042084 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.042098 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.042108 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.144787 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.144834 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.144846 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.144864 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.144878 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.248243 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.248307 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.248325 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.248348 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.248369 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.350950 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.350998 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.351013 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.351032 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.351044 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.453552 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.453590 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.453601 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.453619 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.453630 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.555834 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.555871 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.555880 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.555907 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.555919 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.658251 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.658296 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.658308 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.658325 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.658337 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.761313 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.761368 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.761379 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.761396 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.761409 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.787029 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/0.log" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.790930 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.791679 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.807070 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.819833 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.831179 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.840047 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.851098 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.863687 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.864464 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.864502 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.864516 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.864537 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.864550 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.873255 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.881649 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.891413 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.901475 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.910546 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.922182 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.931258 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.940224 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc 
kubenswrapper[4985]: I0224 10:10:24.956632 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"message\\\":\\\"or removal\\\\nI0224 10:10:22.971639 6685 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:10:22.971646 6685 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:22.971657 6685 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:10:22.971660 6685 handler.go:208] Removed 
*v1.Namespace event handler 1\\\\nI0224 10:10:22.971677 6685 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:10:22.971697 6685 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:10:22.971684 6685 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:10:22.971708 6685 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 10:10:22.971717 6685 factory.go:656] Stopping watch factory\\\\nI0224 10:10:22.971725 6685 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:10:22.971736 6685 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:22.971742 6685 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:22.971684 6685 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:22.971776 6685 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:10:22.972119 6685 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:24Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.966566 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.966601 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.966610 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.966623 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:24 crc kubenswrapper[4985]: I0224 10:10:24.966632 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:24Z","lastTransitionTime":"2026-02-24T10:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.069198 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.069241 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.069252 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.069269 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.069289 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.171280 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.171319 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.171328 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.171344 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.171356 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.263547 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.263616 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.263644 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.263686 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:25 crc kubenswrapper[4985]: E0224 10:10:25.263802 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:25 crc kubenswrapper[4985]: E0224 10:10:25.263949 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:25 crc kubenswrapper[4985]: E0224 10:10:25.264079 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:25 crc kubenswrapper[4985]: E0224 10:10:25.264214 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.274071 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.274418 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.274447 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.274470 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.274486 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.377808 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.377884 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.377923 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.377950 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.377966 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.480396 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.480433 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.480442 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.480456 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.480464 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.583221 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.583278 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.583291 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.583311 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.583324 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.686214 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.686296 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.686307 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.686323 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.686335 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.789197 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.789249 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.789262 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.789277 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.789290 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.796132 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/1.log" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.796689 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/0.log" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.800269 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe" exitCode=1 Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.800332 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.800371 4985 scope.go:117] "RemoveContainer" containerID="59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.801984 4985 scope.go:117] "RemoveContainer" containerID="374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe" Feb 24 10:10:25 crc kubenswrapper[4985]: E0224 10:10:25.802369 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.813557 4985 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc 
kubenswrapper[4985]: I0224 10:10:25.832857 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"message\\\":\\\"or removal\\\\nI0224 10:10:22.971639 6685 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:10:22.971646 6685 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:22.971657 6685 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:10:22.971660 6685 handler.go:208] Removed 
*v1.Namespace event handler 1\\\\nI0224 10:10:22.971677 6685 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:10:22.971697 6685 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:10:22.971684 6685 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:10:22.971708 6685 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 10:10:22.971717 6685 factory.go:656] Stopping watch factory\\\\nI0224 10:10:22.971725 6685 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:10:22.971736 6685 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:22.971742 6685 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:22.971684 6685 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:22.971776 6685 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:10:22.972119 6685 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"na
me\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.847225 4985 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.862223 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.878072 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.889453 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.891304 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.891333 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.891341 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.891356 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.891366 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.907157 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.923478 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.935877 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.950566 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.965131 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.977570 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: 
I0224 10:10:25.987747 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.992871 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.992939 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.992948 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.992961 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:25 crc kubenswrapper[4985]: I0224 10:10:25.992970 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:25Z","lastTransitionTime":"2026-02-24T10:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.002255 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.013943 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.096955 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.097012 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.097031 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.097053 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.097072 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.199467 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.199514 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.199530 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.199551 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.199566 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.278371 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.291206 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.302139 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.302179 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.302189 4985 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.302203 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.302213 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.302859 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dc
a204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.312646 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c
344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.335730 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.356030 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.372991 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.385102 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.401485 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.404184 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.404233 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.404250 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.404273 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.404287 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.415623 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.433783 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59cf50a579cdd65a2981d9c24099f85b6f8708295e48f9ce8e81c9b2999e6189\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"message\\\":\\\"or removal\\\\nI0224 10:10:22.971639 6685 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:10:22.971646 6685 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:22.971657 6685 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:10:22.971660 6685 handler.go:208] Removed 
*v1.Namespace event handler 1\\\\nI0224 10:10:22.971677 6685 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:10:22.971697 6685 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:10:22.971684 6685 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:10:22.971708 6685 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 10:10:22.971717 6685 factory.go:656] Stopping watch factory\\\\nI0224 10:10:22.971725 6685 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:10:22.971736 6685 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:22.971742 6685 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:22.971684 6685 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:22.971776 6685 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:10:22.972119 6685 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"na
me\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.443929 4985 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.459954 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.472363 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.485571 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc 
kubenswrapper[4985]: I0224 10:10:26.506950 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.506993 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.507008 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.507028 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.507041 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.610024 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.610083 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.610102 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.610129 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.610152 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.712730 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.712768 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.712779 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.712795 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.712806 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.805728 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/1.log" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.809550 4985 scope.go:117] "RemoveContainer" containerID="374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe" Feb 24 10:10:26 crc kubenswrapper[4985]: E0224 10:10:26.809718 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.814502 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.814539 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.814549 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.814568 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.814578 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.825236 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.838865 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.850136 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.861320 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.873805 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 
10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.888309 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.897766 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: 
I0224 10:10:26.906981 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.917193 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.917245 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.917256 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.917273 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.917284 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:26Z","lastTransitionTime":"2026-02-24T10:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.918622 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.930506 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.939439 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.954171 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.970550 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:26 crc kubenswrapper[4985]: I0224 10:10:26.983932 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:26Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc 
kubenswrapper[4985]: I0224 10:10:27.007991 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:27Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.019520 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.019563 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.019576 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.019596 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.019608 4985 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.122663 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.122702 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.122714 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.122738 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.122759 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.224792 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.224838 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.224851 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.224870 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.224881 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.237832 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.237868 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.237882 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.237917 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.237926 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.257015 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:27Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.261577 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.261642 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.261666 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.261699 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.261721 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.263906 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.264057 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.264096 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.264123 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.264207 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.264300 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.264696 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.264657 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.276594 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:27Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.281231 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.281304 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.281325 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.281354 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.281384 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.293237 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:27Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.298316 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.298373 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.298389 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.298413 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.298433 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.312421 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:27Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.317345 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.317412 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.317425 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.317458 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.317475 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.334233 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:27Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:27 crc kubenswrapper[4985]: E0224 10:10:27.334412 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.336447 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.336507 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.336519 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.336533 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.336543 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.439446 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.439523 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.439543 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.439575 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.439595 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.544332 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.544405 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.544423 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.544455 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.544473 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.655514 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.655868 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.655881 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.655922 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.655934 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.758334 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.758373 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.758384 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.758402 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.758414 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.860853 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.860925 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.860938 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.860957 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.860969 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.964062 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.964110 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.964125 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.964142 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:27 crc kubenswrapper[4985]: I0224 10:10:27.964152 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:27Z","lastTransitionTime":"2026-02-24T10:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.066736 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.066819 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.066847 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.066933 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.066966 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.170730 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.170825 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.170847 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.170881 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.170943 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.273970 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.274047 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.274069 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.274097 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.274116 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.384731 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.384810 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.384832 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.384860 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.384942 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.488056 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.488144 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.488171 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.488206 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.488231 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.591598 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.591665 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.591676 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.591693 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.591704 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.693852 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.693931 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.693946 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.693971 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.693984 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.796401 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.796435 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.796445 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.796460 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.796469 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.898994 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.899047 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.899065 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.899089 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:28 crc kubenswrapper[4985]: I0224 10:10:28.899106 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:28Z","lastTransitionTime":"2026-02-24T10:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.002516 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.002593 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.002617 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.002655 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.002680 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.078496 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.078646 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.078696 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.078742 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.078877 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:10:45.078840566 +0000 UTC m=+129.553033166 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.078955 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.079016 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.079072 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.079098 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.079022 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: 
E0224 10:10:29.079074 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:45.079043002 +0000 UTC m=+129.553235642 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.079204 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:45.079186576 +0000 UTC m=+129.553379176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.079233 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:45.079220777 +0000 UTC m=+129.553413377 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.105663 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.105727 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.105740 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.105760 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.105775 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.179448 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.179533 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.179766 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.179827 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.179841 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.179999 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:10:45.179959175 +0000 UTC m=+129.654151785 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.180015 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.180090 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:10:45.180066588 +0000 UTC m=+129.654259228 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.208752 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.208802 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.208819 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.208843 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.208861 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.263790 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.263842 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.263918 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.263800 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.264118 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.264709 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.264810 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:29 crc kubenswrapper[4985]: E0224 10:10:29.265007 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.310982 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.311039 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.311058 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.311085 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.311103 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.414398 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.414450 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.414467 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.414489 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.414507 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.517142 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.517208 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.517229 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.517253 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.517271 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.620112 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.620216 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.620239 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.620265 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.620293 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.723766 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.723826 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.723843 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.723866 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.723916 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.826399 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.826465 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.826483 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.826512 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.826547 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.930246 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.930321 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.930353 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.930388 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:29 crc kubenswrapper[4985]: I0224 10:10:29.930408 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:29Z","lastTransitionTime":"2026-02-24T10:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.032795 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.032841 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.032852 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.032869 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.032882 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.136055 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.136136 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.136149 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.136168 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.136180 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.238668 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.238717 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.238766 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.238785 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.238798 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.284777 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.340919 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.340958 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.340969 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.340990 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.341002 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.443524 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.443589 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.443602 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.443620 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.443631 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.546182 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.546434 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.546515 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.546632 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.546719 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.648722 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.648842 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.648854 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.648867 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.648877 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.751901 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.751941 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.751953 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.751968 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.751979 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.853865 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.853943 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.853954 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.853970 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.853982 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.957259 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.957314 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.957326 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.957344 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:30 crc kubenswrapper[4985]: I0224 10:10:30.957356 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:30Z","lastTransitionTime":"2026-02-24T10:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.059243 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.059325 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.059343 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.059375 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.059393 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.161609 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.161673 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.161691 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.161720 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.161747 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.263979 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264025 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264044 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264088 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264179 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: E0224 10:10:31.264207 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264226 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264290 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: E0224 10:10:31.264305 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264317 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.264356 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: E0224 10:10:31.264406 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:31 crc kubenswrapper[4985]: E0224 10:10:31.264498 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.366995 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.367052 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.367065 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.367080 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.367091 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.469730 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.469772 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.469782 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.469798 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.469808 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.572525 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.572599 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.572612 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.572630 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.572642 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.675455 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.675515 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.675535 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.675560 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.675577 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.778338 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.778384 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.778399 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.778424 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.778442 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.881032 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.881187 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.881207 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.881248 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.881264 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.984158 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.984198 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.984207 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.984220 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:31 crc kubenswrapper[4985]: I0224 10:10:31.984233 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:31Z","lastTransitionTime":"2026-02-24T10:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.087308 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.087342 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.087353 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.087366 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.087376 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.190505 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.190554 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.190567 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.190586 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.190602 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.292653 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.292706 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.292717 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.292735 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.292745 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.395115 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.395157 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.395170 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.395187 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.395200 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.498063 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.498135 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.498153 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.498181 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.498200 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.601310 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.601385 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.601407 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.601437 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.601462 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.704394 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.704481 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.704512 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.704543 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.704566 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.749643 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.766782 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc 
kubenswrapper[4985]: I0224 10:10:32.790187 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.805734 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.807054 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.807117 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.807134 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.807157 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.807174 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.829253 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.844542 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.857207 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.871620 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.885234 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.898648 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.910240 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:32 crc 
kubenswrapper[4985]: I0224 10:10:32.910308 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.910328 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.910355 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.910374 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:32Z","lastTransitionTime":"2026-02-24T10:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.917163 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.937945 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.952127 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.964874 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: 
I0224 10:10:32.975740 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:32 crc kubenswrapper[4985]: I0224 10:10:32.988045 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.000247 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:32Z is after 2025-08-24T17:21:41Z"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.012826 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.012864 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.012874 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.012909 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.012924 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.115252 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.115292 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.115300 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.115315 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.115325 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.217662 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.217712 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.217723 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.217740 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.217751 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.263673 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.263729 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.263744 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.263756 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65"
Feb 24 10:10:33 crc kubenswrapper[4985]: E0224 10:10:33.263838 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:10:33 crc kubenswrapper[4985]: E0224 10:10:33.264117 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:10:33 crc kubenswrapper[4985]: E0224 10:10:33.264226 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:10:33 crc kubenswrapper[4985]: E0224 10:10:33.264315 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.319638 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.319677 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.319686 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.319704 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.319717 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.422347 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.422394 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.422406 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.422422 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.422435 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.524560 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.524616 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.524635 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.524658 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.524675 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.627089 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.627160 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.627178 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.627205 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.627224 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.730209 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.730242 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.730300 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.730315 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.730324 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.832648 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.832683 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.832692 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.832705 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.832714 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.934926 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.935010 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.935031 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.935055 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:33 crc kubenswrapper[4985]: I0224 10:10:33.935071 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:33Z","lastTransitionTime":"2026-02-24T10:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.037291 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.037340 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.037352 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.037365 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.037374 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.140133 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.140175 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.140191 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.140205 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.140214 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.242269 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.242311 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.242324 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.242341 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.242351 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.344083 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.344136 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.344150 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.344172 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.344188 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.446614 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.446664 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.446678 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.446698 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.446713 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.549172 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.549233 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.549244 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.549258 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.549267 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.651837 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.651938 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.651959 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.651986 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.652004 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.755357 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.755438 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.755461 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.755491 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.755513 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.859253 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.859337 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.859370 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.859400 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.859423 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.962144 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.962222 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.962248 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.962280 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:34 crc kubenswrapper[4985]: I0224 10:10:34.962303 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:34Z","lastTransitionTime":"2026-02-24T10:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.065841 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.065933 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.065953 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.065978 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.065995 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.168855 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.168954 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.168974 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.169052 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.169070 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.264554 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.264621 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.264642 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.264701 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:10:35 crc kubenswrapper[4985]: E0224 10:10:35.264843 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf"
Feb 24 10:10:35 crc kubenswrapper[4985]: E0224 10:10:35.265061 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:10:35 crc kubenswrapper[4985]: E0224 10:10:35.265193 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:10:35 crc kubenswrapper[4985]: E0224 10:10:35.265227 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.271189 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.271218 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.271230 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.271245 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.271257 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.373744 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.373778 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.373786 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.373799 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.373808 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.475819 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.475909 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.475929 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.475955 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.475971 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.578301 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.578349 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.578360 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.578378 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.578392 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.680704 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.680739 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.680750 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.680767 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.680779 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.782981 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.783031 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.783046 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.783064 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.783076 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.885422 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.885460 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.885469 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.885484 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.885495 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.987702 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.987767 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.987780 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.987799 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:35 crc kubenswrapper[4985]: I0224 10:10:35.987816 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:35Z","lastTransitionTime":"2026-02-24T10:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.090647 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.090702 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.090719 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.090741 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.090758 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:36Z","lastTransitionTime":"2026-02-24T10:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.193857 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.193918 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.193931 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.193945 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.193955 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:36Z","lastTransitionTime":"2026-02-24T10:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.275463 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.286331 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: E0224 10:10:36.294431 4985 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.297693 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.307374 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.338585 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: E0224 10:10:36.361337 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.370738 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863
c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.382912 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.392464 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.404989 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.415185 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.423158 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.440488 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.450798 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.474312 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc kubenswrapper[4985]: I0224 10:10:36.485365 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:36 crc 
kubenswrapper[4985]: I0224 10:10:36.503406 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:36Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.264411 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.264454 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.264515 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.264563 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.264742 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.264836 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.264424 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.265101 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.651632 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.651674 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.651681 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.651695 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.651703 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:37Z","lastTransitionTime":"2026-02-24T10:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.667044 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:37Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.670330 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.670378 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.670428 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.670451 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.670466 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:37Z","lastTransitionTime":"2026-02-24T10:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.689687 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:37Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.695416 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.695637 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.695792 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.695967 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.696131 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:37Z","lastTransitionTime":"2026-02-24T10:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.715436 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:37Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.719568 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.719622 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.719665 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.719690 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.719708 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:37Z","lastTransitionTime":"2026-02-24T10:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.733572 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:37Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.737998 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.738079 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.738096 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.738122 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:37 crc kubenswrapper[4985]: I0224 10:10:37.738140 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:37Z","lastTransitionTime":"2026-02-24T10:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.755077 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:37Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:37 crc kubenswrapper[4985]: E0224 10:10:37.755330 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:10:39 crc kubenswrapper[4985]: I0224 10:10:39.264170 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:39 crc kubenswrapper[4985]: I0224 10:10:39.264248 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:39 crc kubenswrapper[4985]: I0224 10:10:39.264339 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:39 crc kubenswrapper[4985]: E0224 10:10:39.264336 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:39 crc kubenswrapper[4985]: I0224 10:10:39.264192 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:39 crc kubenswrapper[4985]: E0224 10:10:39.264498 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:39 crc kubenswrapper[4985]: E0224 10:10:39.264671 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:39 crc kubenswrapper[4985]: E0224 10:10:39.264769 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.265789 4985 scope.go:117] "RemoveContainer" containerID="374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.857198 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/1.log" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.860226 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807"} Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.860718 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.880328 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.891750 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.906513 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T1
0:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.924935 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.940392 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.956243 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:40 crc kubenswrapper[4985]: I0224 10:10:40.989228 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.006758 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.022363 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.036525 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc 
kubenswrapper[4985]: I0224 10:10:41.061647 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.075466 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.094950 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.112690 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.133408 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.149910 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.264441 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.264529 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.264591 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:41 crc kubenswrapper[4985]: E0224 10:10:41.264725 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.265226 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:41 crc kubenswrapper[4985]: E0224 10:10:41.265315 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:41 crc kubenswrapper[4985]: E0224 10:10:41.265618 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:41 crc kubenswrapper[4985]: E0224 10:10:41.265716 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:41 crc kubenswrapper[4985]: E0224 10:10:41.362940 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.864641 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/2.log" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.865161 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/1.log" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.867017 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807" exitCode=1 Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.867056 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807"} Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.867090 4985 scope.go:117] "RemoveContainer" containerID="374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.867665 4985 scope.go:117] "RemoveContainer" containerID="0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807" Feb 24 10:10:41 crc kubenswrapper[4985]: E0224 10:10:41.867794 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.883636 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.911417 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.932469 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.947547 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.962733 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc 
kubenswrapper[4985]: I0224 10:10:41.983471 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://374d551e69c9bf5e4a9f9cc851abb3d18629f675d0be6f68664926f579bb45fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:24Z\\\",\\\"message\\\":\\\"772757 6847 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 10:10:24.772762 6847 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:10:24.772771 6847 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:10:24.772784 6847 factory.go:656] Stopping watch factory\\\\nI0224 10:10:24.772785 6847 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:10:24.772793 6847 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:10:24.772811 6847 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:10:24.772881 6847 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:10:24.772946 6847 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:24.773029 6847 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:24.773081 6847 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nF0224 10:10:24.773103 6847 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} 
name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb
920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:41 crc kubenswrapper[4985]: I0224 10:10:41.997556 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:41Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.011084 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.027508 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.039811 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.058454 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.083734 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.097402 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: 
I0224 10:10:42.112600 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.129852 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.149153 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.873102 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/2.log" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.878941 4985 scope.go:117] "RemoveContainer" containerID="0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807" Feb 24 10:10:42 crc kubenswrapper[4985]: E0224 10:10:42.879276 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.890274 
4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.906734 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.930757 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.948228 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.966925 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:42 crc kubenswrapper[4985]: I0224 10:10:42.984032 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.001424 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:42Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.018510 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.031627 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc 
kubenswrapper[4985]: I0224 10:10:43.056122 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.073233 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.097930 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.114185 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.129774 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.145287 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.162687 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:43Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.264749 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.264814 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.264833 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:43 crc kubenswrapper[4985]: E0224 10:10:43.265714 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:43 crc kubenswrapper[4985]: E0224 10:10:43.265773 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:43 crc kubenswrapper[4985]: I0224 10:10:43.264908 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:43 crc kubenswrapper[4985]: E0224 10:10:43.265908 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:43 crc kubenswrapper[4985]: E0224 10:10:43.265950 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:44 crc kubenswrapper[4985]: I0224 10:10:44.275138 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 24 10:10:44 crc kubenswrapper[4985]: I0224 10:10:44.275272 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.155008 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.155151 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155174 4985 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:11:17.155155565 +0000 UTC m=+161.629348125 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.155250 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155329 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155368 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:11:17.155359601 +0000 UTC m=+161.629552161 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155420 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155514 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:11:17.155491044 +0000 UTC m=+161.629683664 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.155507 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155651 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 
10:10:45.155694 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155715 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.155779 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:11:17.155760452 +0000 UTC m=+161.629953052 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.256744 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.256819 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.256966 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.256986 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.256997 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.257046 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:11:17.257033636 +0000 UTC m=+161.731226196 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.257082 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.257215 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:11:17.2571811 +0000 UTC m=+161.731373700 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.263431 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.263459 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.263509 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.263470 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:45 crc kubenswrapper[4985]: I0224 10:10:45.263599 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.263604 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.263673 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:45 crc kubenswrapper[4985]: E0224 10:10:45.263731 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.279260 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13
fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj
5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60
293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.289467 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.299713 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.312023 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T1
0:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.324694 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.336140 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.355632 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: E0224 10:10:46.364507 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.370830 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.384397 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.396160 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc 
kubenswrapper[4985]: I0224 10:10:46.413492 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.423591 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.436130 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.447617 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.459506 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.475006 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.488172 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:46 crc kubenswrapper[4985]: I0224 10:10:46.501336 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:46Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.264420 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.264432 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.264492 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.264564 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.264714 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.264873 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.265094 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.265243 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.859139 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.859177 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.859185 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.859199 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.859208 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:47Z","lastTransitionTime":"2026-02-24T10:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.871514 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:47Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.876125 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.876182 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.876203 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.876229 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.876247 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:47Z","lastTransitionTime":"2026-02-24T10:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.895882 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:47Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.900560 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.900592 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.900601 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.900616 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.900625 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:47Z","lastTransitionTime":"2026-02-24T10:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.913599 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:47Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.918527 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.918594 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.918608 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.918633 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.918647 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:47Z","lastTransitionTime":"2026-02-24T10:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.937657 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:47Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.943195 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.943278 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.943307 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.943343 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:47 crc kubenswrapper[4985]: I0224 10:10:47.943364 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:47Z","lastTransitionTime":"2026-02-24T10:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.959109 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:47Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:47 crc kubenswrapper[4985]: E0224 10:10:47.959251 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:10:49 crc kubenswrapper[4985]: I0224 10:10:49.264573 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:49 crc kubenswrapper[4985]: I0224 10:10:49.264630 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:49 crc kubenswrapper[4985]: I0224 10:10:49.264604 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:49 crc kubenswrapper[4985]: I0224 10:10:49.264596 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:49 crc kubenswrapper[4985]: E0224 10:10:49.264980 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:49 crc kubenswrapper[4985]: E0224 10:10:49.265130 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:49 crc kubenswrapper[4985]: E0224 10:10:49.264756 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:49 crc kubenswrapper[4985]: E0224 10:10:49.265303 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:51 crc kubenswrapper[4985]: I0224 10:10:51.263946 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:51 crc kubenswrapper[4985]: I0224 10:10:51.263982 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:51 crc kubenswrapper[4985]: E0224 10:10:51.264154 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:51 crc kubenswrapper[4985]: E0224 10:10:51.264394 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:51 crc kubenswrapper[4985]: I0224 10:10:51.264576 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:51 crc kubenswrapper[4985]: I0224 10:10:51.264711 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:51 crc kubenswrapper[4985]: E0224 10:10:51.264770 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:51 crc kubenswrapper[4985]: E0224 10:10:51.265027 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:51 crc kubenswrapper[4985]: E0224 10:10:51.366770 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:10:53 crc kubenswrapper[4985]: I0224 10:10:53.264160 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:53 crc kubenswrapper[4985]: I0224 10:10:53.264273 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:53 crc kubenswrapper[4985]: I0224 10:10:53.264268 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:53 crc kubenswrapper[4985]: I0224 10:10:53.264364 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:53 crc kubenswrapper[4985]: E0224 10:10:53.264381 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:53 crc kubenswrapper[4985]: E0224 10:10:53.264453 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:53 crc kubenswrapper[4985]: E0224 10:10:53.264571 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:53 crc kubenswrapper[4985]: E0224 10:10:53.264642 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:54 crc kubenswrapper[4985]: I0224 10:10:54.265247 4985 scope.go:117] "RemoveContainer" containerID="0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807" Feb 24 10:10:54 crc kubenswrapper[4985]: E0224 10:10:54.265553 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:10:55 crc kubenswrapper[4985]: I0224 10:10:55.264095 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:55 crc kubenswrapper[4985]: I0224 10:10:55.264195 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:55 crc kubenswrapper[4985]: I0224 10:10:55.264272 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:55 crc kubenswrapper[4985]: E0224 10:10:55.264391 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:55 crc kubenswrapper[4985]: I0224 10:10:55.264471 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:55 crc kubenswrapper[4985]: E0224 10:10:55.264692 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:55 crc kubenswrapper[4985]: E0224 10:10:55.264818 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:55 crc kubenswrapper[4985]: E0224 10:10:55.264996 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.301658 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.331971 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.355822 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: 
E0224 10:10:56.367856 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.374543 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-2
4T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.391278 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.408605 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.424788 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.439245 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.462258 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.481467 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.498342 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.509589 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc 
kubenswrapper[4985]: I0224 10:10:56.532875 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.549372 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.561377 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.578599 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.596840 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:56 crc kubenswrapper[4985]: I0224 10:10:56.617613 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1
dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:57 crc kubenswrapper[4985]: I0224 10:10:57.263863 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:57 crc kubenswrapper[4985]: I0224 10:10:57.263982 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:57 crc kubenswrapper[4985]: I0224 10:10:57.264045 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:57 crc kubenswrapper[4985]: E0224 10:10:57.264197 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:57 crc kubenswrapper[4985]: I0224 10:10:57.264222 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:57 crc kubenswrapper[4985]: E0224 10:10:57.264357 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:57 crc kubenswrapper[4985]: E0224 10:10:57.264624 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:57 crc kubenswrapper[4985]: E0224 10:10:57.264863 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.039070 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.039199 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.039220 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.039240 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.039253 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:58Z","lastTransitionTime":"2026-02-24T10:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:58 crc kubenswrapper[4985]: E0224 10:10:58.053518 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.057260 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.057300 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.057313 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.057328 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.057337 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:58Z","lastTransitionTime":"2026-02-24T10:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:58 crc kubenswrapper[4985]: E0224 10:10:58.070796 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.074682 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.074712 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.074722 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.074737 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.074747 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:58Z","lastTransitionTime":"2026-02-24T10:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:58 crc kubenswrapper[4985]: E0224 10:10:58.087036 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.096213 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.096273 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.096290 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.096309 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.096322 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:58Z","lastTransitionTime":"2026-02-24T10:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:58 crc kubenswrapper[4985]: E0224 10:10:58.109309 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.112535 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.112580 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.112603 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.112623 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:10:58 crc kubenswrapper[4985]: I0224 10:10:58.112635 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:10:58Z","lastTransitionTime":"2026-02-24T10:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:10:58 crc kubenswrapper[4985]: E0224 10:10:58.125496 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:10:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:10:58 crc kubenswrapper[4985]: E0224 10:10:58.125626 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:10:59 crc kubenswrapper[4985]: I0224 10:10:59.264484 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:10:59 crc kubenswrapper[4985]: I0224 10:10:59.264554 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:10:59 crc kubenswrapper[4985]: I0224 10:10:59.264504 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:10:59 crc kubenswrapper[4985]: I0224 10:10:59.264483 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:10:59 crc kubenswrapper[4985]: E0224 10:10:59.264689 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:10:59 crc kubenswrapper[4985]: E0224 10:10:59.264810 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:10:59 crc kubenswrapper[4985]: E0224 10:10:59.265018 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:10:59 crc kubenswrapper[4985]: E0224 10:10:59.265094 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:00 crc kubenswrapper[4985]: I0224 10:11:00.940092 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/0.log" Feb 24 10:11:00 crc kubenswrapper[4985]: I0224 10:11:00.940190 4985 generic.go:334] "Generic (PLEG): container finished" podID="731349d2-7b07-4bc9-81f8-c7d75bca842a" containerID="85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e" exitCode=1 Feb 24 10:11:00 crc kubenswrapper[4985]: I0224 10:11:00.940252 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerDied","Data":"85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e"} Feb 24 10:11:00 crc kubenswrapper[4985]: I0224 10:11:00.940953 4985 scope.go:117] "RemoveContainer" containerID="85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e" Feb 24 10:11:00 crc kubenswrapper[4985]: I0224 10:11:00.961143 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:00 crc kubenswrapper[4985]: I0224 10:11:00.985561 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.004319 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.022023 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.037771 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.061618 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.077330 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.090743 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: 
I0224 10:11:01.102281 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.115479 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.130678 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.145569 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.172590 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.187674 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.200293 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.214232 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc 
kubenswrapper[4985]: I0224 10:11:01.235472 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.249639 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.263967 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.264001 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.264014 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.263968 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:01 crc kubenswrapper[4985]: E0224 10:11:01.264113 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:01 crc kubenswrapper[4985]: E0224 10:11:01.264199 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:01 crc kubenswrapper[4985]: E0224 10:11:01.264373 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:01 crc kubenswrapper[4985]: E0224 10:11:01.264572 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:01 crc kubenswrapper[4985]: E0224 10:11:01.369167 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.946752 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/0.log" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.946821 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerStarted","Data":"9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4"} Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.966857 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9
fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:01 crc kubenswrapper[4985]: I0224 10:11:01.986268 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:01Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.003603 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.021733 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: 
I0224 10:11:02.035406 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.053968 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.073572 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.087076 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.099919 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc 
kubenswrapper[4985]: I0224 10:11:02.123855 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.138239 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.156329 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.179724 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.194235 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.206494 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.220609 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33
db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.234065 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:02 crc kubenswrapper[4985]: I0224 10:11:02.250173 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:02Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:03 crc kubenswrapper[4985]: I0224 10:11:03.264259 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:11:03 crc kubenswrapper[4985]: E0224 10:11:03.264737 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:11:03 crc kubenswrapper[4985]: I0224 10:11:03.264264 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:11:03 crc kubenswrapper[4985]: I0224 10:11:03.264385 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:11:03 crc kubenswrapper[4985]: E0224 10:11:03.265014 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:11:03 crc kubenswrapper[4985]: I0224 10:11:03.264329 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65"
Feb 24 10:11:03 crc kubenswrapper[4985]: E0224 10:11:03.265171 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:11:03 crc kubenswrapper[4985]: E0224 10:11:03.265284 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf"
Feb 24 10:11:05 crc kubenswrapper[4985]: I0224 10:11:05.263668 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:11:05 crc kubenswrapper[4985]: I0224 10:11:05.263746 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:11:05 crc kubenswrapper[4985]: I0224 10:11:05.263669 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65"
Feb 24 10:11:05 crc kubenswrapper[4985]: E0224 10:11:05.263850 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:11:05 crc kubenswrapper[4985]: E0224 10:11:05.263954 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf"
Feb 24 10:11:05 crc kubenswrapper[4985]: I0224 10:11:05.263740 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:11:05 crc kubenswrapper[4985]: E0224 10:11:05.264132 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:11:05 crc kubenswrapper[4985]: E0224 10:11:05.264283 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.278746 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.295471 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":
\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.306376 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 
2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.336071 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.351923 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: E0224 10:11:06.374302 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.387542 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.432077 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.444727 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.458384 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05afccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.469531 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33
db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.480905 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.494056 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.504250 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.516399 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T1
0:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.531103 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.542196 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: 
I0224 10:11:06.552786 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:06 crc kubenswrapper[4985]: I0224 10:11:06.572074 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:07 crc kubenswrapper[4985]: I0224 10:11:07.264531 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:07 crc kubenswrapper[4985]: I0224 10:11:07.264641 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:07 crc kubenswrapper[4985]: I0224 10:11:07.264539 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:07 crc kubenswrapper[4985]: E0224 10:11:07.264813 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:07 crc kubenswrapper[4985]: E0224 10:11:07.264965 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:07 crc kubenswrapper[4985]: E0224 10:11:07.265098 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:07 crc kubenswrapper[4985]: I0224 10:11:07.265118 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:07 crc kubenswrapper[4985]: E0224 10:11:07.265371 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.513492 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.513565 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.513583 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.513607 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.513625 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:08Z","lastTransitionTime":"2026-02-24T10:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:08 crc kubenswrapper[4985]: E0224 10:11:08.531203 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:08Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.536314 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.536354 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.536364 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.536380 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.536390 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:08Z","lastTransitionTime":"2026-02-24T10:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:08 crc kubenswrapper[4985]: E0224 10:11:08.553359 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:08Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.557702 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.557752 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.557769 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.557791 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.557808 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:08Z","lastTransitionTime":"2026-02-24T10:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:08 crc kubenswrapper[4985]: E0224 10:11:08.573034 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:08Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.577011 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.577044 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.577053 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.577067 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.577078 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:08Z","lastTransitionTime":"2026-02-24T10:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:08 crc kubenswrapper[4985]: E0224 10:11:08.592862 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:08Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.596510 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.596559 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.596570 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.596589 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:08 crc kubenswrapper[4985]: I0224 10:11:08.596604 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:08Z","lastTransitionTime":"2026-02-24T10:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:08 crc kubenswrapper[4985]: E0224 10:11:08.610697 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:08Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:08 crc kubenswrapper[4985]: E0224 10:11:08.611047 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.263914 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.263997 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:09 crc kubenswrapper[4985]: E0224 10:11:09.264193 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.264235 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.264243 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:09 crc kubenswrapper[4985]: E0224 10:11:09.264758 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:09 crc kubenswrapper[4985]: E0224 10:11:09.264995 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:09 crc kubenswrapper[4985]: E0224 10:11:09.265154 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.265253 4985 scope.go:117] "RemoveContainer" containerID="0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.974221 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/2.log" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.977418 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.977849 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:11:09 crc kubenswrapper[4985]: I0224 10:11:09.991636 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:09Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.003478 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.013798 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc 
kubenswrapper[4985]: I0224 10:11:10.036290 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.046756 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.057093 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.074764 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.089430 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.101047 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.115007 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ 
'[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.132362 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.146403 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.160281 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.173156 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.188634 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.202596 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.218443 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: 
I0224 10:11:10.229707 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:10Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.982421 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/3.log" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.983220 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/2.log" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.986036 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" exitCode=1 Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.986075 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.986106 4985 scope.go:117] "RemoveContainer" containerID="0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807" Feb 24 10:11:10 crc kubenswrapper[4985]: I0224 10:11:10.987467 4985 scope.go:117] 
"RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:11:10 crc kubenswrapper[4985]: E0224 10:11:10.987752 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.003920 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.020255 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.031407 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: 
I0224 10:11:11.041055 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.051152 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.063409 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.073323 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.091452 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.109063 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.120018 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.134938 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc 
kubenswrapper[4985]: I0224 10:11:11.163165 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aec8312b8a8ac9d698c0e1253f847faf52c391d69d8276e73b1e774a6819807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:10:41Z\\\",\\\"message\\\":\\\"ft-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0224 10:10:41.298714 7041 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0224 10:10:41.298592 7041 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:10:41.298707 7041 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/certified-operators]} name:Service_openshift-marketplace/certified-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.214:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {20da2226-531c-4179-9810-aa4026995ca3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 10:10:41.298743 7041 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0224 10:10:41.298781 7041 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nF0224 10:10:41.298810 7041 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:10Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 10:11:10.206860 7359 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:11:10.207012 7359 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 
10:11:10.207040 7359 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:11:10.208015 7359 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:11:10.208045 7359 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:11:10.208051 7359 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0224 10:11:10.208063 7359 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:11:10.208092 7359 factory.go:656] Stopping watch factory\\\\nI0224 10:11:10.208106 7359 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:11:10.208139 7359 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:11:10.208155 7359 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:11:10.208162 7359 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni
-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc 
kubenswrapper[4985]: I0224 10:11:11.175602 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.191205 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386
ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.204745 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.215537 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.228957 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.240220 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:11Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.263516 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.263536 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.263589 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.263532 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:11 crc kubenswrapper[4985]: E0224 10:11:11.263666 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:11 crc kubenswrapper[4985]: E0224 10:11:11.263735 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:11 crc kubenswrapper[4985]: E0224 10:11:11.263803 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:11 crc kubenswrapper[4985]: E0224 10:11:11.263865 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:11 crc kubenswrapper[4985]: E0224 10:11:11.375547 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:11 crc kubenswrapper[4985]: I0224 10:11:11.992859 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/3.log" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.000078 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:11:12 crc kubenswrapper[4985]: E0224 10:11:12.000468 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.020525 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.037834 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.057652 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.078703 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.093047 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.111272 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc43
3bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.137968 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acc
e3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.151398 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: 
I0224 10:11:12.163193 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.176141 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.191242 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.203808 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.216196 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.236742 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.250146 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.261581 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc kubenswrapper[4985]: I0224 10:11:12.274433 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:12 crc 
kubenswrapper[4985]: I0224 10:11:12.295263 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:10Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 10:11:10.206860 7359 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:11:10.207012 7359 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:11:10.207040 7359 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:11:10.208015 7359 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:11:10.208045 7359 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:11:10.208051 7359 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0224 10:11:10.208063 7359 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:11:10.208092 7359 factory.go:656] Stopping watch factory\\\\nI0224 10:11:10.208106 7359 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:11:10.208139 7359 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:11:10.208155 7359 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:11:10.208162 7359 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:11:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:12Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:13 crc kubenswrapper[4985]: I0224 10:11:13.263870 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:13 crc kubenswrapper[4985]: I0224 10:11:13.263983 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:13 crc kubenswrapper[4985]: I0224 10:11:13.264033 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:13 crc kubenswrapper[4985]: E0224 10:11:13.264224 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:13 crc kubenswrapper[4985]: I0224 10:11:13.264279 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:13 crc kubenswrapper[4985]: E0224 10:11:13.264445 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:13 crc kubenswrapper[4985]: E0224 10:11:13.264616 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:13 crc kubenswrapper[4985]: E0224 10:11:13.264792 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:15 crc kubenswrapper[4985]: I0224 10:11:15.264503 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:15 crc kubenswrapper[4985]: E0224 10:11:15.264976 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:15 crc kubenswrapper[4985]: I0224 10:11:15.264571 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:15 crc kubenswrapper[4985]: I0224 10:11:15.264527 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:15 crc kubenswrapper[4985]: E0224 10:11:15.265070 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:15 crc kubenswrapper[4985]: I0224 10:11:15.264613 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:15 crc kubenswrapper[4985]: E0224 10:11:15.265204 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:15 crc kubenswrapper[4985]: E0224 10:11:15.265383 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.289229 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.299316 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6395bdd-0d8e-4572-ae86-695e87aff12e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:09:31.800132 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:09:31.800281 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:09:31.801147 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1923306337/tls.crt::/tmp/serving-cert-1923306337/tls.key\\\\\\\"\\\\nI0224 10:09:31.990905 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:09:31.993717 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:09:31.993747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:09:31.993768 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0224 10:09:31.993775 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:09:31.997392 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0224 10:09:31.997421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997427 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:09:31.997436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0224 10:09:31.997435 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:09:31.997440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:09:31.997467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:09:31.997470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:09:31.999573 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.320790 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4bc690c-38e6-488d-97b2-9bf37a916fe4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4787f21b4cef49e96547a1d1e7105c79d01d7863afd7ea013f9a38cb40dd06\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa381e3ea366641ef4fa423fdd46f8afb44b1b2a2ccb6c754007456149b75e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e071fe863c4e9eb6ebecb3d58d1c4a3c66577cbe46568a4d80a5de5663636539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac924a3bb0825085e397dbc788641894675e9556fd8aeadcbf9a24a65e1a076\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91acce3dbd19205b8faf23208b7d65c0c47d8156eaf6d7d3357ce1ca64a3bb88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81482396eb82b6ef2f09d03cf5e6bc785209394a50c63b199581fc70120bb9fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556b84b0e9e72b0c26c58defa2c030943daffa83d97b60293cc5b5627a6c9174\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svj5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xj9h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.337558 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11c1c7b8-18df-4583-849f-76b62544344b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8546cf975b145528b7f94bd03c6570de0db1a059477da5795b7569250bde3ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c693
5a646b86604ea8285e1aab0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcfbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hq52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.349835 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jj7jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4776cd42-9a5c-4c2e-a585-a1f4a49d2d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f45d6400640ba9b59793abc9aee79bb1c32439ff377abdf8b4ec4d1cd7c336a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvmf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jj7jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.367588 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: E0224 10:11:16.377752 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.383315 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137c9b42aa2109e16d0e273e3e7d116db30ed716c1e6f78fb8f4e87409e3f252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40736fa6ee59064a7fc5bed9b08671d094e7a11b6a8bf4ec45220a0d62156542\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.403614 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b3986ef-e9be-43db-9350-ccc7dd3f713f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:10Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 10:11:10.206860 7359 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:11:10.207012 7359 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:11:10.207040 7359 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:11:10.208015 7359 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:11:10.208045 7359 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:11:10.208051 7359 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0224 10:11:10.208063 7359 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:11:10.208092 7359 factory.go:656] Stopping watch factory\\\\nI0224 10:11:10.208106 7359 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:11:10.208139 7359 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 10:11:10.208155 7359 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:11:10.208162 7359 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0224 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:11:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de70126733b3634f4
e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n6c9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27dpt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.413559 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qtqms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43e3252-1b22-48a0-8895-ad98fc88b7cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778506fd4d9af7b6ebd4e4d46282938146ba0184d9747ca657957c78118c10e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jdldr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qtqms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.425652 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71a2eb97-3905-45f5-8448-ab8a6e27e0fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ed8363bc5fff9f3b57a12ef0133eae4aa1447b427d1fc98e49e4155a154dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b8a045a92dbfe12e3eed0e5ec7fda3284a16cb5a1ab0296ed1bd44c427f4b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6163846a3f1d508e3ea6bad31b1818569d4339c10d24284aa2360a98f4f78351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6388ea35afa9218a9ebe327d6c6a9b35c7b92bd77a1f95e4e569ba759ec9aeb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.443808 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42276a15-2a73-4687-91ad-d5ff65f8526d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239bae44bdfe47737a87a7c7ffe103f138acfb89e1eb16c3901b25e8a0fa12e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb360380a66a7d4923e8db278af3a2024d5ed789e85e1427aaad600c2754224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7968a5b1a705405cdca66b1c914475bd58b8c511980dfe312abd7ddf447d6375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://813f41fed199f25206b4e475a175beaa3f3223c8b4f946e533f4290f64dbd069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a1fe7d1c64e5170ec0ce50343eb2a81b002e5b8bb4160c5ab4abafcc4ece67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5db1ab98b50bb081b07ffa0a8660ef886047d941d9b8509f484011950d0a94a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T10:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9edf4a1146efaf26193d3f8c63eb95cd945e7be5d36dc825d24053417ac5b1ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2592e50d0dc75b9c89a94486e27887c6487e21a72ad53055b1b9af0cdfd73ec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.457788 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e0873cd1521fe3cd14005f22118db4616043670f4b895435e71ddec9d82a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.469728 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f9d78f6a52c55d1f70fa119f84fdb11505d511d394a8ed61a298e79dfb2dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.479927 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xkc65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4340d1a-60cb-4240-87ba-1e468c9c41cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xkc65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc 
kubenswrapper[4985]: I0224 10:11:16.490925 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f25863d6-c7fb-47ed-98bf-1e51a5d31280\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a122488c66088f525d614bb431c3211b6c4cc2e607703b5b4ab70ed998df6bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80172a23d36c0c1ea92ff824608237fed22c6e46e43ac33db31a9aff8d181b2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:09:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:08:38.386655 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:08:38.388712 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:08:38.420524 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:08:38.423674 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0224 10:09:09.020819 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:09:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://386ac8359b4df2fa31547b13090041850a839a61e549ff58ef8d5c33476e6e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21b46ed1562aa1c67d31b5ed21686d633a5c8758c65d02566833c8c2b34d6fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:08:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.502469 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.512637 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.522962 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q24bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731349d2-7b07-4bc9-81f8-c7d75bca842a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:11:00Z\\\",\\\"message\\\":\\\"2026-02-24T10:10:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0\\\\n2026-02-24T10:10:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_482b94f0-6a0d-4f69-917f-1ca7bdfc7bb0 to /host/opt/cni/bin/\\\\n2026-02-24T10:10:15Z [verbose] multus-daemon started\\\\n2026-02-24T10:10:15Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:11:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5g9sh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q24bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:16 crc kubenswrapper[4985]: I0224 10:11:16.533399 4985 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a34c00-910b-400b-bc96-9d805e076b7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:10:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65682d7b02ccbc0c344ab0b8eaf55d227d0b6e4a04e56d1038d51aa30747b068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f2dbaf1dfa5fd40bbf8a0d82613a96b05af
ccc71653f6d1f24db6674950b58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:10:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2zg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:10:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kkmm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:16Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.201854 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 
10:11:17.202070 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:21.202039145 +0000 UTC m=+225.676231715 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.202256 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.202318 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.202364 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202486 4985 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202517 4985 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202532 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202558 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:12:21.202546369 +0000 UTC m=+225.676738939 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202572 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202581 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:12:21.20256634 +0000 UTC m=+225.676758910 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202600 4985 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.202693 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:12:21.202670473 +0000 UTC m=+225.676863073 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.264219 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.264419 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.264514 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.264623 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.264693 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.264795 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.264919 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.265023 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.303159 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:17 crc kubenswrapper[4985]: I0224 10:11:17.303222 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.303357 4985 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.303432 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs podName:d4340d1a-60cb-4240-87ba-1e468c9c41cf nodeName:}" failed. No retries permitted until 2026-02-24 10:12:21.303410262 +0000 UTC m=+225.777602872 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs") pod "network-metrics-daemon-xkc65" (UID: "d4340d1a-60cb-4240-87ba-1e468c9c41cf") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.303482 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.303528 4985 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.303574 4985 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:11:17 crc kubenswrapper[4985]: E0224 10:11:17.303673 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:12:21.30364644 +0000 UTC m=+225.777839050 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.721590 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.721647 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.721659 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.721679 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.721698 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:18Z","lastTransitionTime":"2026-02-24T10:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:18 crc kubenswrapper[4985]: E0224 10:11:18.741145 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.745820 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.745871 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.745907 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.745926 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.745967 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:18Z","lastTransitionTime":"2026-02-24T10:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:18 crc kubenswrapper[4985]: E0224 10:11:18.764101 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.776561 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.776630 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.776650 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.776686 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.776755 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:18Z","lastTransitionTime":"2026-02-24T10:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:18 crc kubenswrapper[4985]: E0224 10:11:18.798446 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.803269 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.803327 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.803402 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.803439 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.803463 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:18Z","lastTransitionTime":"2026-02-24T10:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:18 crc kubenswrapper[4985]: E0224 10:11:18.818395 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.822816 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.822915 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.822944 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.822981 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:18 crc kubenswrapper[4985]: I0224 10:11:18.823006 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:18Z","lastTransitionTime":"2026-02-24T10:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:18 crc kubenswrapper[4985]: E0224 10:11:18.841134 4985 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fa96d71-6980-4f45-b02a-32602082091f\\\",\\\"systemUUID\\\":\\\"77848161-2cfb-4f0a-8e60-0ac9426710fd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:11:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:11:18 crc kubenswrapper[4985]: E0224 10:11:18.841427 4985 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:11:19 crc kubenswrapper[4985]: I0224 10:11:19.264573 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:19 crc kubenswrapper[4985]: I0224 10:11:19.265165 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:19 crc kubenswrapper[4985]: E0224 10:11:19.265375 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:19 crc kubenswrapper[4985]: I0224 10:11:19.265468 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:19 crc kubenswrapper[4985]: E0224 10:11:19.265633 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:19 crc kubenswrapper[4985]: E0224 10:11:19.265802 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:19 crc kubenswrapper[4985]: I0224 10:11:19.265819 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:19 crc kubenswrapper[4985]: E0224 10:11:19.266002 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:21 crc kubenswrapper[4985]: I0224 10:11:21.264371 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:21 crc kubenswrapper[4985]: I0224 10:11:21.264419 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:21 crc kubenswrapper[4985]: I0224 10:11:21.264390 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:21 crc kubenswrapper[4985]: E0224 10:11:21.264574 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:21 crc kubenswrapper[4985]: E0224 10:11:21.264660 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:21 crc kubenswrapper[4985]: E0224 10:11:21.264761 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:21 crc kubenswrapper[4985]: I0224 10:11:21.266197 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:21 crc kubenswrapper[4985]: E0224 10:11:21.266710 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:21 crc kubenswrapper[4985]: E0224 10:11:21.379275 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:23 crc kubenswrapper[4985]: I0224 10:11:23.264218 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:23 crc kubenswrapper[4985]: I0224 10:11:23.264288 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:23 crc kubenswrapper[4985]: I0224 10:11:23.264297 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:23 crc kubenswrapper[4985]: E0224 10:11:23.264407 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:23 crc kubenswrapper[4985]: I0224 10:11:23.264226 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:23 crc kubenswrapper[4985]: E0224 10:11:23.264536 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:23 crc kubenswrapper[4985]: E0224 10:11:23.264672 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:23 crc kubenswrapper[4985]: E0224 10:11:23.264783 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:25 crc kubenswrapper[4985]: I0224 10:11:25.264382 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:25 crc kubenswrapper[4985]: E0224 10:11:25.264527 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:25 crc kubenswrapper[4985]: I0224 10:11:25.264635 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:25 crc kubenswrapper[4985]: I0224 10:11:25.264766 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:25 crc kubenswrapper[4985]: I0224 10:11:25.264648 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:25 crc kubenswrapper[4985]: E0224 10:11:25.264860 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:25 crc kubenswrapper[4985]: E0224 10:11:25.264964 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:25 crc kubenswrapper[4985]: E0224 10:11:25.265075 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.299426 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kkmm8" podStartSLOduration=124.299411974 podStartE2EDuration="2m4.299411974s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.298942871 +0000 UTC m=+170.773135431" watchObservedRunningTime="2026-02-24 10:11:26.299411974 +0000 UTC m=+170.773604534" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.337932 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=42.337871698 podStartE2EDuration="42.337871698s" podCreationTimestamp="2026-02-24 10:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.316743513 +0000 UTC m=+170.790936073" watchObservedRunningTime="2026-02-24 10:11:26.337871698 +0000 UTC m=+170.812064298" Feb 24 10:11:26 crc kubenswrapper[4985]: E0224 10:11:26.379632 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.397085 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q24bf" podStartSLOduration=124.397067756 podStartE2EDuration="2m4.397067756s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.374575842 +0000 UTC m=+170.848768412" watchObservedRunningTime="2026-02-24 10:11:26.397067756 +0000 UTC m=+170.871260316" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.422581 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.422557384 podStartE2EDuration="1m12.422557384s" podCreationTimestamp="2026-02-24 10:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.397597952 +0000 UTC m=+170.871790512" watchObservedRunningTime="2026-02-24 10:11:26.422557384 +0000 UTC m=+170.896749944" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.436189 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xj9h5" podStartSLOduration=124.436169418 podStartE2EDuration="2m4.436169418s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.423802009 +0000 UTC m=+170.897994569" watchObservedRunningTime="2026-02-24 10:11:26.436169418 +0000 UTC m=+170.910361978" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.436586 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podStartSLOduration=124.436582019 
podStartE2EDuration="2m4.436582019s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.436357283 +0000 UTC m=+170.910549843" watchObservedRunningTime="2026-02-24 10:11:26.436582019 +0000 UTC m=+170.910774579" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.460955 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jj7jq" podStartSLOduration=124.460938786 podStartE2EDuration="2m4.460938786s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.449376331 +0000 UTC m=+170.923568891" watchObservedRunningTime="2026-02-24 10:11:26.460938786 +0000 UTC m=+170.935131336" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.520052 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qtqms" podStartSLOduration=124.520021721 podStartE2EDuration="2m4.520021721s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.519675372 +0000 UTC m=+170.993867932" watchObservedRunningTime="2026-02-24 10:11:26.520021721 +0000 UTC m=+170.994214321" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.541541 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.541526497 podStartE2EDuration="10.541526497s" podCreationTimestamp="2026-02-24 10:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.541522137 +0000 UTC 
m=+171.015714727" watchObservedRunningTime="2026-02-24 10:11:26.541526497 +0000 UTC m=+171.015719057" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.541733 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.541729022 podStartE2EDuration="42.541729022s" podCreationTimestamp="2026-02-24 10:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.532420991 +0000 UTC m=+171.006613561" watchObservedRunningTime="2026-02-24 10:11:26.541729022 +0000 UTC m=+171.015921582" Feb 24 10:11:26 crc kubenswrapper[4985]: I0224 10:11:26.566334 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=56.566315566 podStartE2EDuration="56.566315566s" podCreationTimestamp="2026-02-24 10:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:26.565569055 +0000 UTC m=+171.039761615" watchObservedRunningTime="2026-02-24 10:11:26.566315566 +0000 UTC m=+171.040508136" Feb 24 10:11:27 crc kubenswrapper[4985]: I0224 10:11:27.263994 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:27 crc kubenswrapper[4985]: I0224 10:11:27.264116 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:27 crc kubenswrapper[4985]: E0224 10:11:27.264260 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:27 crc kubenswrapper[4985]: I0224 10:11:27.264407 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:27 crc kubenswrapper[4985]: I0224 10:11:27.264427 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:27 crc kubenswrapper[4985]: E0224 10:11:27.264566 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:27 crc kubenswrapper[4985]: E0224 10:11:27.265114 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:27 crc kubenswrapper[4985]: E0224 10:11:27.265270 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:27 crc kubenswrapper[4985]: I0224 10:11:27.265384 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:11:27 crc kubenswrapper[4985]: E0224 10:11:27.265558 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.075987 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.076030 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.076047 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.076071 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.076091 4985 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:11:29Z","lastTransitionTime":"2026-02-24T10:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.139782 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5"] Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.140334 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.143114 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.144202 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.144307 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.144729 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.239783 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.239936 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: 
\"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.239975 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.240006 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.240062 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.263772 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.263876 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.263797 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.263978 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:29 crc kubenswrapper[4985]: E0224 10:11:29.264287 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:29 crc kubenswrapper[4985]: E0224 10:11:29.264454 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:29 crc kubenswrapper[4985]: E0224 10:11:29.264695 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:29 crc kubenswrapper[4985]: E0224 10:11:29.264817 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.293122 4985 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.304412 4985 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.341494 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.341587 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.341663 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.341701 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.341734 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.342228 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.342720 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc 
kubenswrapper[4985]: I0224 10:11:29.343920 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.355265 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.365774 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hh7k5\" (UID: \"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:29 crc kubenswrapper[4985]: I0224 10:11:29.469720 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" Feb 24 10:11:30 crc kubenswrapper[4985]: I0224 10:11:30.059008 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" event={"ID":"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c","Type":"ContainerStarted","Data":"4df180b2cc5687c48fe8f7adb695d19deaf74b0fa6100aeffe2d85b4a673f192"} Feb 24 10:11:30 crc kubenswrapper[4985]: I0224 10:11:30.059097 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" event={"ID":"a0e2e597-dd85-4b96-ac3e-67bc94cc1c2c","Type":"ContainerStarted","Data":"97838947861cba0d2cdaf7f005012098ec7dd823f5e9dd16e7e805799ceafb54"} Feb 24 10:11:31 crc kubenswrapper[4985]: I0224 10:11:31.263539 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:31 crc kubenswrapper[4985]: I0224 10:11:31.263573 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:31 crc kubenswrapper[4985]: I0224 10:11:31.263551 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:31 crc kubenswrapper[4985]: I0224 10:11:31.263540 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:31 crc kubenswrapper[4985]: E0224 10:11:31.263671 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:31 crc kubenswrapper[4985]: E0224 10:11:31.263740 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:31 crc kubenswrapper[4985]: E0224 10:11:31.263865 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:31 crc kubenswrapper[4985]: E0224 10:11:31.264041 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:31 crc kubenswrapper[4985]: E0224 10:11:31.381162 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:33 crc kubenswrapper[4985]: I0224 10:11:33.264053 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:33 crc kubenswrapper[4985]: I0224 10:11:33.264084 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:33 crc kubenswrapper[4985]: I0224 10:11:33.264119 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:33 crc kubenswrapper[4985]: I0224 10:11:33.264208 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:33 crc kubenswrapper[4985]: E0224 10:11:33.264550 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:33 crc kubenswrapper[4985]: E0224 10:11:33.264929 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:33 crc kubenswrapper[4985]: E0224 10:11:33.264979 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:33 crc kubenswrapper[4985]: E0224 10:11:33.265096 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:35 crc kubenswrapper[4985]: I0224 10:11:35.264369 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:35 crc kubenswrapper[4985]: I0224 10:11:35.264485 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:35 crc kubenswrapper[4985]: E0224 10:11:35.264526 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:35 crc kubenswrapper[4985]: I0224 10:11:35.264547 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:35 crc kubenswrapper[4985]: I0224 10:11:35.264596 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:35 crc kubenswrapper[4985]: E0224 10:11:35.264706 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:35 crc kubenswrapper[4985]: E0224 10:11:35.264796 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:35 crc kubenswrapper[4985]: E0224 10:11:35.264950 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:36 crc kubenswrapper[4985]: E0224 10:11:36.381544 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:37 crc kubenswrapper[4985]: I0224 10:11:37.264309 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:37 crc kubenswrapper[4985]: I0224 10:11:37.264385 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:37 crc kubenswrapper[4985]: E0224 10:11:37.264428 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:37 crc kubenswrapper[4985]: I0224 10:11:37.264485 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:37 crc kubenswrapper[4985]: I0224 10:11:37.264624 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:37 crc kubenswrapper[4985]: E0224 10:11:37.264687 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:37 crc kubenswrapper[4985]: E0224 10:11:37.264778 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:37 crc kubenswrapper[4985]: E0224 10:11:37.264833 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:38 crc kubenswrapper[4985]: I0224 10:11:38.265594 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:11:38 crc kubenswrapper[4985]: E0224 10:11:38.265854 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27dpt_openshift-ovn-kubernetes(1b3986ef-e9be-43db-9350-ccc7dd3f713f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" Feb 24 10:11:39 crc kubenswrapper[4985]: I0224 10:11:39.264497 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:39 crc kubenswrapper[4985]: I0224 10:11:39.264586 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:39 crc kubenswrapper[4985]: E0224 10:11:39.264908 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:39 crc kubenswrapper[4985]: I0224 10:11:39.264647 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:39 crc kubenswrapper[4985]: I0224 10:11:39.264623 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:39 crc kubenswrapper[4985]: E0224 10:11:39.265146 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:39 crc kubenswrapper[4985]: E0224 10:11:39.265327 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:39 crc kubenswrapper[4985]: E0224 10:11:39.265479 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:41 crc kubenswrapper[4985]: I0224 10:11:41.263736 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:41 crc kubenswrapper[4985]: I0224 10:11:41.263762 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:41 crc kubenswrapper[4985]: E0224 10:11:41.263881 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:41 crc kubenswrapper[4985]: I0224 10:11:41.263955 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:41 crc kubenswrapper[4985]: I0224 10:11:41.264002 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:41 crc kubenswrapper[4985]: E0224 10:11:41.264133 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:41 crc kubenswrapper[4985]: E0224 10:11:41.264230 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:41 crc kubenswrapper[4985]: E0224 10:11:41.264313 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:41 crc kubenswrapper[4985]: E0224 10:11:41.383296 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:43 crc kubenswrapper[4985]: I0224 10:11:43.264514 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:43 crc kubenswrapper[4985]: I0224 10:11:43.264534 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:43 crc kubenswrapper[4985]: I0224 10:11:43.264548 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:43 crc kubenswrapper[4985]: E0224 10:11:43.264736 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:43 crc kubenswrapper[4985]: E0224 10:11:43.264923 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:43 crc kubenswrapper[4985]: E0224 10:11:43.264994 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:43 crc kubenswrapper[4985]: I0224 10:11:43.266163 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:43 crc kubenswrapper[4985]: E0224 10:11:43.266409 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:45 crc kubenswrapper[4985]: I0224 10:11:45.263963 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:45 crc kubenswrapper[4985]: I0224 10:11:45.264000 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:45 crc kubenswrapper[4985]: I0224 10:11:45.264037 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:45 crc kubenswrapper[4985]: I0224 10:11:45.264263 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:45 crc kubenswrapper[4985]: E0224 10:11:45.264261 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:45 crc kubenswrapper[4985]: E0224 10:11:45.264352 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:45 crc kubenswrapper[4985]: E0224 10:11:45.264434 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:45 crc kubenswrapper[4985]: E0224 10:11:45.264827 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:46 crc kubenswrapper[4985]: E0224 10:11:46.383708 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.121105 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/1.log" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.121738 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/0.log" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.121813 4985 generic.go:334] "Generic (PLEG): container finished" podID="731349d2-7b07-4bc9-81f8-c7d75bca842a" containerID="9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4" exitCode=1 Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.121857 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerDied","Data":"9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4"} Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.121934 4985 scope.go:117] "RemoveContainer" containerID="85e965ad144b3440864b3c9de70aacd2a2e49c715dca204b0e4cfae65954cd9e" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.122547 4985 scope.go:117] "RemoveContainer" containerID="9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4" Feb 24 10:11:47 crc kubenswrapper[4985]: E0224 10:11:47.127035 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-q24bf_openshift-multus(731349d2-7b07-4bc9-81f8-c7d75bca842a)\"" pod="openshift-multus/multus-q24bf" podUID="731349d2-7b07-4bc9-81f8-c7d75bca842a" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.143930 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hh7k5" podStartSLOduration=145.143910535 
podStartE2EDuration="2m25.143910535s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:30.075433118 +0000 UTC m=+174.549625728" watchObservedRunningTime="2026-02-24 10:11:47.143910535 +0000 UTC m=+191.618103105" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.264077 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.264128 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.264167 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:47 crc kubenswrapper[4985]: I0224 10:11:47.264235 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:47 crc kubenswrapper[4985]: E0224 10:11:47.264250 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:47 crc kubenswrapper[4985]: E0224 10:11:47.264323 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:47 crc kubenswrapper[4985]: E0224 10:11:47.264444 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:47 crc kubenswrapper[4985]: E0224 10:11:47.264591 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:48 crc kubenswrapper[4985]: I0224 10:11:48.128477 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/1.log" Feb 24 10:11:49 crc kubenswrapper[4985]: I0224 10:11:49.264072 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:49 crc kubenswrapper[4985]: I0224 10:11:49.264208 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:49 crc kubenswrapper[4985]: I0224 10:11:49.264208 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:49 crc kubenswrapper[4985]: E0224 10:11:49.264277 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:49 crc kubenswrapper[4985]: I0224 10:11:49.264384 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:49 crc kubenswrapper[4985]: E0224 10:11:49.264610 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:49 crc kubenswrapper[4985]: E0224 10:11:49.264691 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:49 crc kubenswrapper[4985]: E0224 10:11:49.265230 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:51 crc kubenswrapper[4985]: I0224 10:11:51.264110 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:51 crc kubenswrapper[4985]: I0224 10:11:51.264156 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:51 crc kubenswrapper[4985]: I0224 10:11:51.264119 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:51 crc kubenswrapper[4985]: I0224 10:11:51.264212 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:51 crc kubenswrapper[4985]: E0224 10:11:51.264359 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:51 crc kubenswrapper[4985]: E0224 10:11:51.264530 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:51 crc kubenswrapper[4985]: E0224 10:11:51.264599 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:51 crc kubenswrapper[4985]: E0224 10:11:51.264692 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:51 crc kubenswrapper[4985]: E0224 10:11:51.385201 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:11:52 crc kubenswrapper[4985]: I0224 10:11:52.266221 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.137163 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xkc65"] Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.137805 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:53 crc kubenswrapper[4985]: E0224 10:11:53.137968 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.149070 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/3.log" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.152210 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerStarted","Data":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.152846 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.207515 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podStartSLOduration=151.207482949 podStartE2EDuration="2m31.207482949s" 
podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:11:53.201483426 +0000 UTC m=+197.675676006" watchObservedRunningTime="2026-02-24 10:11:53.207482949 +0000 UTC m=+197.681675529" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.264069 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:53 crc kubenswrapper[4985]: E0224 10:11:53.264216 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.264282 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:53 crc kubenswrapper[4985]: I0224 10:11:53.264314 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:53 crc kubenswrapper[4985]: E0224 10:11:53.264369 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:53 crc kubenswrapper[4985]: E0224 10:11:53.264533 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:55 crc kubenswrapper[4985]: I0224 10:11:55.264564 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:55 crc kubenswrapper[4985]: I0224 10:11:55.264596 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:55 crc kubenswrapper[4985]: I0224 10:11:55.264643 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:55 crc kubenswrapper[4985]: E0224 10:11:55.265276 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:55 crc kubenswrapper[4985]: E0224 10:11:55.265075 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:55 crc kubenswrapper[4985]: I0224 10:11:55.264665 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:55 crc kubenswrapper[4985]: E0224 10:11:55.265336 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:55 crc kubenswrapper[4985]: E0224 10:11:55.265416 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:56 crc kubenswrapper[4985]: E0224 10:11:56.385800 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:11:57 crc kubenswrapper[4985]: I0224 10:11:57.264093 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:57 crc kubenswrapper[4985]: I0224 10:11:57.264264 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:57 crc kubenswrapper[4985]: E0224 10:11:57.264324 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:57 crc kubenswrapper[4985]: I0224 10:11:57.264432 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:57 crc kubenswrapper[4985]: I0224 10:11:57.264491 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:57 crc kubenswrapper[4985]: E0224 10:11:57.264566 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:11:57 crc kubenswrapper[4985]: E0224 10:11:57.264654 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:57 crc kubenswrapper[4985]: E0224 10:11:57.264704 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:59 crc kubenswrapper[4985]: I0224 10:11:59.263984 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:11:59 crc kubenswrapper[4985]: I0224 10:11:59.263999 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:11:59 crc kubenswrapper[4985]: I0224 10:11:59.264021 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:11:59 crc kubenswrapper[4985]: E0224 10:11:59.264162 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:11:59 crc kubenswrapper[4985]: E0224 10:11:59.264225 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:11:59 crc kubenswrapper[4985]: I0224 10:11:59.264284 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:11:59 crc kubenswrapper[4985]: E0224 10:11:59.264378 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:11:59 crc kubenswrapper[4985]: E0224 10:11:59.264457 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:12:01 crc kubenswrapper[4985]: I0224 10:12:01.264400 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:01 crc kubenswrapper[4985]: I0224 10:12:01.264456 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:01 crc kubenswrapper[4985]: I0224 10:12:01.264489 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:01 crc kubenswrapper[4985]: I0224 10:12:01.264542 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:01 crc kubenswrapper[4985]: E0224 10:12:01.264670 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:12:01 crc kubenswrapper[4985]: E0224 10:12:01.264843 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:12:01 crc kubenswrapper[4985]: E0224 10:12:01.265012 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:12:01 crc kubenswrapper[4985]: E0224 10:12:01.265267 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:12:01 crc kubenswrapper[4985]: E0224 10:12:01.387810 4985 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:12:02 crc kubenswrapper[4985]: I0224 10:12:02.264532 4985 scope.go:117] "RemoveContainer" containerID="9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4" Feb 24 10:12:03 crc kubenswrapper[4985]: I0224 10:12:03.190326 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/1.log" Feb 24 10:12:03 crc kubenswrapper[4985]: I0224 10:12:03.190694 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerStarted","Data":"45ef150d9f586a382ad3aad2d1f3a60b3f6c286f0151c6dd77de5400353fb59a"} Feb 24 10:12:03 crc kubenswrapper[4985]: I0224 10:12:03.263829 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:03 crc kubenswrapper[4985]: I0224 10:12:03.263868 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:03 crc kubenswrapper[4985]: I0224 10:12:03.263931 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:03 crc kubenswrapper[4985]: I0224 10:12:03.263915 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:03 crc kubenswrapper[4985]: E0224 10:12:03.264020 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:12:03 crc kubenswrapper[4985]: E0224 10:12:03.264125 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:12:03 crc kubenswrapper[4985]: E0224 10:12:03.264221 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:12:03 crc kubenswrapper[4985]: E0224 10:12:03.264308 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:12:05 crc kubenswrapper[4985]: I0224 10:12:05.263865 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:05 crc kubenswrapper[4985]: I0224 10:12:05.263974 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:05 crc kubenswrapper[4985]: I0224 10:12:05.263990 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:05 crc kubenswrapper[4985]: I0224 10:12:05.263874 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:05 crc kubenswrapper[4985]: E0224 10:12:05.264081 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:12:05 crc kubenswrapper[4985]: E0224 10:12:05.264222 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:12:05 crc kubenswrapper[4985]: E0224 10:12:05.264298 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:12:05 crc kubenswrapper[4985]: E0224 10:12:05.264350 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xkc65" podUID="d4340d1a-60cb-4240-87ba-1e468c9c41cf" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.264443 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.264523 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.265630 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.265654 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.269639 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.270063 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.270499 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.309102 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.310007 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 24 10:12:07 crc kubenswrapper[4985]: I0224 10:12:07.310034 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.183115 4985 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.232786 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nnr7f"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.233444 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.233984 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t79l8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.234825 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.234964 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nncfz"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.235420 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.240050 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-d7v4s"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.240532 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.242281 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.242481 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.242576 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.242808 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.243251 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.243332 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.243386 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.243507 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.243756 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.243989 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:12:10 crc kubenswrapper[4985]: W0224 10:12:10.244340 4985 reflector.go:561] 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 24 10:12:10 crc kubenswrapper[4985]: E0224 10:12:10.244397 4985 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.245020 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.245575 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.245663 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.246044 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.246136 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.246277 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.246367 4985 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.246949 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.249623 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.250631 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.252590 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.254098 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.254313 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.259071 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.259292 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v8tln"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.259353 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.260181 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.279541 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.285233 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xh5q8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.285799 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291384 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4958ed59-9dc5-4e53-a250-feb5516c6c97-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z6mz6\" (UID: \"4958ed59-9dc5-4e53-a250-feb5516c6c97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291454 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-oauth-config\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291485 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-service-ca\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291512 
4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdhc\" (UniqueName: \"kubernetes.io/projected/8905bada-0871-41b5-9a21-83e4d3884edf-kube-api-access-7tdhc\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291534 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-serving-cert\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291560 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lthw\" (UniqueName: \"kubernetes.io/projected/8c5e6fc2-42c9-4794-91cc-1f74adf686db-kube-api-access-6lthw\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291583 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-config\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291605 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbsq2\" (UniqueName: \"kubernetes.io/projected/4958ed59-9dc5-4e53-a250-feb5516c6c97-kube-api-access-fbsq2\") pod \"cluster-samples-operator-665b6dd947-z6mz6\" (UID: 
\"4958ed59-9dc5-4e53-a250-feb5516c6c97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291631 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-image-import-ca\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291657 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291684 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8905bada-0871-41b5-9a21-83e4d3884edf-node-pullsecrets\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291709 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-trusted-ca-bundle\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291733 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-etcd-client\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291769 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-serving-cert\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291794 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-config\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291821 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-etcd-serving-ca\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291844 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291869 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-encryption-config\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291912 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-client-ca\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291940 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-client-ca\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291965 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20592ddc-38f3-4407-a5c3-b719b89cd42e-serving-cert\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.291991 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8905bada-0871-41b5-9a21-83e4d3884edf-audit-dir\") pod \"apiserver-76f77b778f-nnr7f\" (UID: 
\"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292015 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-audit-policies\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292044 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98f046c-a925-4946-9571-4aba6008f4b5-config\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292079 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-encryption-config\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292103 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8skx2\" (UniqueName: \"kubernetes.io/projected/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-kube-api-access-8skx2\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292124 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292146 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6t2\" (UniqueName: \"kubernetes.io/projected/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-kube-api-access-fd6t2\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292165 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d98f046c-a925-4946-9571-4aba6008f4b5-images\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292188 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292210 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-audit-dir\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 
10:12:10.292257 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-serving-cert\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292278 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-audit\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292302 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-config\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292323 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbk9r\" (UniqueName: \"kubernetes.io/projected/20592ddc-38f3-4407-a5c3-b719b89cd42e-kube-api-access-bbk9r\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292345 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-serving-cert\") pod \"apiserver-76f77b778f-nnr7f\" (UID: 
\"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292367 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d98f046c-a925-4946-9571-4aba6008f4b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292388 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjjr\" (UniqueName: \"kubernetes.io/projected/d98f046c-a925-4946-9571-4aba6008f4b5-kube-api-access-mxjjr\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292470 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-etcd-client\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292495 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-config\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.292520 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-oauth-serving-cert\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.302336 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.316777 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.325194 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.329523 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.329570 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.329748 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.329767 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.330065 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.333635 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.334550 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 24 10:12:10 crc 
kubenswrapper[4985]: I0224 10:12:10.335194 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.336097 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.337194 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bl6z5"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.337567 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n8mff"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.337962 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.338064 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.346503 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.346939 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.349951 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.350099 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.350308 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.350501 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.350503 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.360558 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.360847 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.361113 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.361250 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.361434 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 
10:12:10.361593 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.361718 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.365870 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.366555 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.367043 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.371305 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.384574 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.393349 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.393949 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.394125 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.394284 4985 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.394449 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.395048 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.396557 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.396847 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.396569 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397068 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397121 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397226 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397384 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397606 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397771 4985 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397806 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.397960 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398075 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398100 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398197 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398278 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398286 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398215 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zjjx8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398524 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398609 4985 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398685 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398697 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-config\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398722 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdrz\" (UniqueName: \"kubernetes.io/projected/5dba6bb5-c7aa-45b0-82b4-990febac8e99-kube-api-access-qvdrz\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398740 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-oauth-serving-cert\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398758 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398986 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399030 4985 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399095 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399196 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398758 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4958ed59-9dc5-4e53-a250-feb5516c6c97-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z6mz6\" (UID: \"4958ed59-9dc5-4e53-a250-feb5516c6c97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399313 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175055a6-4d1f-47c1-8947-287934749c2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399348 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-oauth-config\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399369 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-service-ca-bundle\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399398 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-policies\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399432 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175055a6-4d1f-47c1-8947-287934749c2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399454 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d200ef1-f292-460a-a971-1ac47d688fb1-config\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399476 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-service-ca\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" 
Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399494 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdhc\" (UniqueName: \"kubernetes.io/projected/8905bada-0871-41b5-9a21-83e4d3884edf-kube-api-access-7tdhc\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399514 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399535 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vnbg\" (UniqueName: \"kubernetes.io/projected/175055a6-4d1f-47c1-8947-287934749c2b-kube-api-access-4vnbg\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399561 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lthw\" (UniqueName: \"kubernetes.io/projected/8c5e6fc2-42c9-4794-91cc-1f74adf686db-kube-api-access-6lthw\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399581 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-config\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399599 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbsq2\" (UniqueName: \"kubernetes.io/projected/4958ed59-9dc5-4e53-a250-feb5516c6c97-kube-api-access-fbsq2\") pod \"cluster-samples-operator-665b6dd947-z6mz6\" (UID: \"4958ed59-9dc5-4e53-a250-feb5516c6c97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399617 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-image-import-ca\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399633 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399648 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mtzk\" (UniqueName: \"kubernetes.io/projected/8d200ef1-f292-460a-a971-1ac47d688fb1-kube-api-access-8mtzk\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: 
I0224 10:12:10.399664 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-serving-cert\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399679 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-dir\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399696 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-config\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399714 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8905bada-0871-41b5-9a21-83e4d3884edf-node-pullsecrets\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399732 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-trusted-ca-bundle\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " 
pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399747 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-etcd-client\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399762 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399780 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399809 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-serving-cert\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399828 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-config\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399845 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-etcd-serving-ca\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399860 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.398246 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399947 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-encryption-config\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399968 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: 
\"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399984 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qss5w\" (UniqueName: \"kubernetes.io/projected/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-kube-api-access-qss5w\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400001 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-client-ca\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400021 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-client-ca\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400036 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20592ddc-38f3-4407-a5c3-b719b89cd42e-serving-cert\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400052 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8905bada-0871-41b5-9a21-83e4d3884edf-audit-dir\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400073 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-audit-policies\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400092 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5dba6bb5-c7aa-45b0-82b4-990febac8e99-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400106 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d200ef1-f292-460a-a971-1ac47d688fb1-serving-cert\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400122 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98f046c-a925-4946-9571-4aba6008f4b5-config\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400139 4985 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-config\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400165 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-encryption-config\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400186 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8skx2\" (UniqueName: \"kubernetes.io/projected/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-kube-api-access-8skx2\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400203 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd6t2\" (UniqueName: \"kubernetes.io/projected/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-kube-api-access-fd6t2\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400218 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400232 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d98f046c-a925-4946-9571-4aba6008f4b5-images\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400247 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400263 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400280 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400296 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-machine-approver-tls\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400314 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-audit-dir\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400330 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-auth-proxy-config\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400624 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-config\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.401249 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-oauth-serving-cert\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.401253 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-service-ca\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.401414 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.401540 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.401734 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.402002 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.402172 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.402367 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.403680 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 
24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.403765 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d98f046c-a925-4946-9571-4aba6008f4b5-config\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.400366 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d200ef1-f292-460a-a971-1ac47d688fb1-trusted-ca\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404801 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404843 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dba6bb5-c7aa-45b0-82b4-990febac8e99-serving-cert\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404866 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404906 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2mfz\" (UniqueName: \"kubernetes.io/projected/efdfb8c2-12c0-4134-a05d-f7c880e70649-kube-api-access-r2mfz\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404933 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-serving-cert\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404955 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-audit\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.404977 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-config\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405004 4985 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bbk9r\" (UniqueName: \"kubernetes.io/projected/20592ddc-38f3-4407-a5c3-b719b89cd42e-kube-api-access-bbk9r\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405022 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-serving-cert\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405040 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405066 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjjr\" (UniqueName: \"kubernetes.io/projected/d98f046c-a925-4946-9571-4aba6008f4b5-kube-api-access-mxjjr\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405083 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405100 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405116 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405134 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trlvj\" (UniqueName: \"kubernetes.io/projected/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-kube-api-access-trlvj\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405157 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d98f046c-a925-4946-9571-4aba6008f4b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc 
kubenswrapper[4985]: I0224 10:12:10.405167 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-image-import-ca\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405176 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-etcd-client\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405237 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efdfb8c2-12c0-4134-a05d-f7c880e70649-serving-cert\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.405786 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-audit-policies\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.406296 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8905bada-0871-41b5-9a21-83e4d3884edf-node-pullsecrets\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 
24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.407000 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-config\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.399805 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jhlml"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.407097 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d98f046c-a925-4946-9571-4aba6008f4b5-images\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.407569 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.407606 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-etcd-serving-ca\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.407764 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-audit-dir\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.408108 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.408232 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.408450 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-client-ca\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.408784 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zjjx8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.409223 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8905bada-0871-41b5-9a21-83e4d3884edf-audit-dir\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.409573 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-config\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.410043 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8905bada-0871-41b5-9a21-83e4d3884edf-audit\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.410209 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-config\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.410759 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-client-ca\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.411326 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.411673 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.411935 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.412047 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.412149 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.419736 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-serving-cert\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.421570 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d98f046c-a925-4946-9571-4aba6008f4b5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.425156 4985 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t8w87"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.425695 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.444326 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-serving-cert\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.444951 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-etcd-client\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.445509 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20592ddc-38f3-4407-a5c3-b719b89cd42e-serving-cert\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.446029 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-oauth-config\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.446991 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4958ed59-9dc5-4e53-a250-feb5516c6c97-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-z6mz6\" (UID: \"4958ed59-9dc5-4e53-a250-feb5516c6c97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.448702 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-serving-cert\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.449782 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-etcd-client\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.450741 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-encryption-config\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.444265 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5e6fc2-42c9-4794-91cc-1f74adf686db-console-serving-cert\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.456248 4985 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.456420 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8905bada-0871-41b5-9a21-83e4d3884edf-encryption-config\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.457817 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.459603 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.459783 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.461238 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.487987 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.488417 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.461172 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.488825 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 
10:12:10.490253 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.490949 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.491410 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gcwfl"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.491673 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.491969 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.491974 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.492117 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.491676 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.492306 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.492018 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.492509 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.492773 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.493204 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.493497 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.494500 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.494611 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.495767 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggvx6"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.496485 4985 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.496585 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.497313 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.497829 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.499729 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.500130 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.500981 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.501619 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.501873 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.503034 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lthw\" (UniqueName: \"kubernetes.io/projected/8c5e6fc2-42c9-4794-91cc-1f74adf686db-kube-api-access-6lthw\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.503453 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5e6fc2-42c9-4794-91cc-1f74adf686db-trusted-ca-bundle\") pod \"console-f9d7485db-d7v4s\" (UID: \"8c5e6fc2-42c9-4794-91cc-1f74adf686db\") " pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506246 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506302 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trlvj\" (UniqueName: 
\"kubernetes.io/projected/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-kube-api-access-trlvj\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506335 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506352 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506368 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506386 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efdfb8c2-12c0-4134-a05d-f7c880e70649-serving-cert\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" 
Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506407 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhb6t\" (UniqueName: \"kubernetes.io/projected/bce7d3e8-b855-43ae-8527-fbc14ac50521-kube-api-access-vhb6t\") pod \"downloads-7954f5f757-zjjx8\" (UID: \"bce7d3e8-b855-43ae-8527-fbc14ac50521\") " pod="openshift-console/downloads-7954f5f757-zjjx8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506425 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/751d5065-e3f3-4fe5-9d24-ca6c7197d0d6-metrics-tls\") pod \"dns-operator-744455d44c-t8w87\" (UID: \"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506446 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdrz\" (UniqueName: \"kubernetes.io/projected/5dba6bb5-c7aa-45b0-82b4-990febac8e99-kube-api-access-qvdrz\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506478 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175055a6-4d1f-47c1-8947-287934749c2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506498 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-service-ca\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506537 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-service-ca-bundle\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506554 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d200ef1-f292-460a-a971-1ac47d688fb1-config\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506570 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-policies\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506587 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175055a6-4d1f-47c1-8947-287934749c2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506604 
4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-client\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506625 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506648 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vnbg\" (UniqueName: \"kubernetes.io/projected/175055a6-4d1f-47c1-8947-287934749c2b-kube-api-access-4vnbg\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506671 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mtzk\" (UniqueName: \"kubernetes.io/projected/8d200ef1-f292-460a-a971-1ac47d688fb1-kube-api-access-8mtzk\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506725 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-dir\") pod 
\"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506740 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-config\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506767 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbd8d\" (UniqueName: \"kubernetes.io/projected/751d5065-e3f3-4fe5-9d24-ca6c7197d0d6-kube-api-access-jbd8d\") pod \"dns-operator-744455d44c-t8w87\" (UID: \"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506793 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506815 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506856 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hbp\" (UniqueName: \"kubernetes.io/projected/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-kube-api-access-d8hbp\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506883 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506921 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qss5w\" (UniqueName: \"kubernetes.io/projected/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-kube-api-access-qss5w\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506937 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-config\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.506984 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5dba6bb5-c7aa-45b0-82b4-990febac8e99-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507003 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d200ef1-f292-460a-a971-1ac47d688fb1-serving-cert\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507035 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-config\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507072 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507087 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507103 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-machine-approver-tls\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507118 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-serving-cert\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507113 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbsq2\" (UniqueName: \"kubernetes.io/projected/4958ed59-9dc5-4e53-a250-feb5516c6c97-kube-api-access-fbsq2\") pod \"cluster-samples-operator-665b6dd947-z6mz6\" (UID: \"4958ed59-9dc5-4e53-a250-feb5516c6c97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.508309 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdhc\" (UniqueName: \"kubernetes.io/projected/8905bada-0871-41b5-9a21-83e4d3884edf-kube-api-access-7tdhc\") pod \"apiserver-76f77b778f-nnr7f\" (UID: \"8905bada-0871-41b5-9a21-83e4d3884edf\") " pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.507144 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-auth-proxy-config\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 
10:12:10.508523 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db4512d4-0e5a-467b-85da-2eb3addffa4f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.508654 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db4512d4-0e5a-467b-85da-2eb3addffa4f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.508749 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d200ef1-f292-460a-a971-1ac47d688fb1-trusted-ca\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.508849 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.508948 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dba6bb5-c7aa-45b0-82b4-990febac8e99-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.509009 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-auth-proxy-config\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.509085 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2mfz\" (UniqueName: \"kubernetes.io/projected/efdfb8c2-12c0-4134-a05d-f7c880e70649-kube-api-access-r2mfz\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.509159 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.509232 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-ca\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.509299 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46nsb\" (UniqueName: \"kubernetes.io/projected/db4512d4-0e5a-467b-85da-2eb3addffa4f-kube-api-access-46nsb\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.510572 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d200ef1-f292-460a-a971-1ac47d688fb1-trusted-ca\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.512799 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-config\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.513295 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjjr\" (UniqueName: \"kubernetes.io/projected/d98f046c-a925-4946-9571-4aba6008f4b5-kube-api-access-mxjjr\") pod \"machine-api-operator-5694c8668f-t79l8\" (UID: \"d98f046c-a925-4946-9571-4aba6008f4b5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.513429 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dba6bb5-c7aa-45b0-82b4-990febac8e99-serving-cert\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.513549 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5dba6bb5-c7aa-45b0-82b4-990febac8e99-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.514416 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.514763 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.515304 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.515957 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8d200ef1-f292-460a-a971-1ac47d688fb1-serving-cert\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.516170 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.516247 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-service-ca-bundle\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.516459 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.517061 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.517415 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-dir\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.517598 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d200ef1-f292-460a-a971-1ac47d688fb1-config\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.518470 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-policies\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.519368 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nnr7f"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.519833 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.520290 4985 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.522343 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175055a6-4d1f-47c1-8947-287934749c2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.523431 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.523752 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.523975 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.526182 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.526981 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.527254 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efdfb8c2-12c0-4134-a05d-f7c880e70649-config\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.526996 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.527845 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.528244 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-machine-approver-tls\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.528337 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rlq9j"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.528774 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.529172 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.529864 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.531384 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.532163 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mlncn"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.542107 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175055a6-4d1f-47c1-8947-287934749c2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.542118 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.542253 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.542292 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efdfb8c2-12c0-4134-a05d-f7c880e70649-serving-cert\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.542374 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.542976 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.544100 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.544154 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.545180 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.546406 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.548357 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.550426 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.550734 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t79l8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.553279 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.554926 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.555624 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8skx2\" (UniqueName: \"kubernetes.io/projected/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-kube-api-access-8skx2\") pod \"controller-manager-879f6c89f-nncfz\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.560066 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-d7v4s"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.565191 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nncfz"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.565991 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-225jm"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.567054 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-txrfz"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.567844 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.567969 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.568444 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.568662 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.568782 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.569701 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6t2\" (UniqueName: \"kubernetes.io/projected/d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9-kube-api-access-fd6t2\") pod \"apiserver-7bbb656c7d-jlxbx\" (UID: \"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.571016 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.571288 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t8w87"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.572244 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zjjx8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.573321 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bl6z5"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.574465 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.575416 4985 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-rmswq"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.576317 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.576351 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.576686 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xh5q8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.577719 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jhlml"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.579719 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.580971 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.582176 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.582975 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.583946 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggvx6"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.584963 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"] Feb 24 
10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.585963 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.586844 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.586912 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.589808 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w5s2c"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.590775 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.593342 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.596978 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.597756 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rlq9j"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.599866 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rmswq"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.601353 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.602137 4985 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v8tln"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.603121 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.604222 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-225jm"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.606708 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.608022 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w5s2c"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.608267 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.609652 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610220 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-config\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610252 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnc29\" (UniqueName: \"kubernetes.io/projected/9debae0d-c342-4558-8796-bc9798c297de-kube-api-access-pnc29\") pod \"migrator-59844c95c7-2hxbt\" (UID: 
\"9debae0d-c342-4558-8796-bc9798c297de\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610276 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cc3b979-d591-4106-8874-760925fd10f6-service-ca-bundle\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610306 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-stats-auth\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610325 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3da134f-45d4-4d12-bc38-87546e0920cd-cert\") pod \"ingress-canary-rmswq\" (UID: \"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610351 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-serving-cert\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610367 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-metrics-certs\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610397 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc233e3-9d26-4ec2-957f-350400af638e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610428 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db4512d4-0e5a-467b-85da-2eb3addffa4f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610447 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db4512d4-0e5a-467b-85da-2eb3addffa4f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610475 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-ca\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610495 
4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46nsb\" (UniqueName: \"kubernetes.io/projected/db4512d4-0e5a-467b-85da-2eb3addffa4f-kube-api-access-46nsb\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610522 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7std\" (UniqueName: \"kubernetes.io/projected/e3da134f-45d4-4d12-bc38-87546e0920cd-kube-api-access-z7std\") pod \"ingress-canary-rmswq\" (UID: \"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610559 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhb6t\" (UniqueName: \"kubernetes.io/projected/bce7d3e8-b855-43ae-8527-fbc14ac50521-kube-api-access-vhb6t\") pod \"downloads-7954f5f757-zjjx8\" (UID: \"bce7d3e8-b855-43ae-8527-fbc14ac50521\") " pod="openshift-console/downloads-7954f5f757-zjjx8" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610581 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d97920-f891-4f3a-9ccc-b5c10a64e22b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qhzl5\" (UID: \"78d97920-f891-4f3a-9ccc-b5c10a64e22b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610606 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/751d5065-e3f3-4fe5-9d24-ca6c7197d0d6-metrics-tls\") pod 
\"dns-operator-744455d44c-t8w87\" (UID: \"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610624 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc233e3-9d26-4ec2-957f-350400af638e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610647 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-service-ca\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610662 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbk9r\" (UniqueName: \"kubernetes.io/projected/20592ddc-38f3-4407-a5c3-b719b89cd42e-kube-api-access-bbk9r\") pod \"route-controller-manager-6576b87f9c-t527v\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610683 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-client\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610783 4985 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jbd8d\" (UniqueName: \"kubernetes.io/projected/751d5065-e3f3-4fe5-9d24-ca6c7197d0d6-kube-api-access-jbd8d\") pod \"dns-operator-744455d44c-t8w87\" (UID: \"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610821 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-default-certificate\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610841 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrnms\" (UniqueName: \"kubernetes.io/projected/4cc3b979-d591-4106-8874-760925fd10f6-kube-api-access-lrnms\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610881 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc233e3-9d26-4ec2-957f-350400af638e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610967 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hbp\" (UniqueName: \"kubernetes.io/projected/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-kube-api-access-d8hbp\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.610990 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lfgs\" (UniqueName: \"kubernetes.io/projected/78d97920-f891-4f3a-9ccc-b5c10a64e22b-kube-api-access-7lfgs\") pod \"control-plane-machine-set-operator-78cbb6b69f-qhzl5\" (UID: \"78d97920-f891-4f3a-9ccc-b5c10a64e22b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.611069 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.612581 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-service-ca\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.612733 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-ca\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.613344 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-config\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.613399 4985 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n8mff"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.613402 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db4512d4-0e5a-467b-85da-2eb3addffa4f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.614607 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-etcd-client\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.614651 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db4512d4-0e5a-467b-85da-2eb3addffa4f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.615622 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.616042 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.616272 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-serving-cert\") pod \"etcd-operator-b45778765-jhlml\" (UID: 
\"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.616972 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.617939 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mlncn"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.619207 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.620235 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-txrfz"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.622137 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n5mkh"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.623401 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.623653 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n5mkh"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.624513 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6ssps"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.628142 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6ssps" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.628743 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.634517 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.638014 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.658475 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.677313 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.686949 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.687752 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/751d5065-e3f3-4fe5-9d24-ca6c7197d0d6-metrics-tls\") pod \"dns-operator-744455d44c-t8w87\" (UID: \"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.697219 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.700508 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.711863 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7std\" (UniqueName: \"kubernetes.io/projected/e3da134f-45d4-4d12-bc38-87546e0920cd-kube-api-access-z7std\") pod \"ingress-canary-rmswq\" (UID: \"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.711931 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d97920-f891-4f3a-9ccc-b5c10a64e22b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qhzl5\" (UID: \"78d97920-f891-4f3a-9ccc-b5c10a64e22b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.711961 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc233e3-9d26-4ec2-957f-350400af638e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712012 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-default-certificate\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " 
pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712029 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrnms\" (UniqueName: \"kubernetes.io/projected/4cc3b979-d591-4106-8874-760925fd10f6-kube-api-access-lrnms\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712051 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc233e3-9d26-4ec2-957f-350400af638e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712071 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lfgs\" (UniqueName: \"kubernetes.io/projected/78d97920-f891-4f3a-9ccc-b5c10a64e22b-kube-api-access-7lfgs\") pod \"control-plane-machine-set-operator-78cbb6b69f-qhzl5\" (UID: \"78d97920-f891-4f3a-9ccc-b5c10a64e22b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712100 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnc29\" (UniqueName: \"kubernetes.io/projected/9debae0d-c342-4558-8796-bc9798c297de-kube-api-access-pnc29\") pod \"migrator-59844c95c7-2hxbt\" (UID: \"9debae0d-c342-4558-8796-bc9798c297de\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712123 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4cc3b979-d591-4106-8874-760925fd10f6-service-ca-bundle\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712151 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-stats-auth\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712170 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3da134f-45d4-4d12-bc38-87546e0920cd-cert\") pod \"ingress-canary-rmswq\" (UID: \"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712187 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-metrics-certs\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.712205 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc233e3-9d26-4ec2-957f-350400af638e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.735842 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.755585 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.776841 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.786990 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-default-certificate\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.796775 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.806507 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-stats-auth\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.815974 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.826849 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cc3b979-d591-4106-8874-760925fd10f6-metrics-certs\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.833502 4985 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nncfz"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.836782 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.848491 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t79l8"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.849407 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cc3b979-d591-4106-8874-760925fd10f6-service-ca-bundle\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.855337 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.878096 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.896360 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.916475 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.934251 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.935859 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.956239 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.957596 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v"] Feb 24 10:12:10 crc kubenswrapper[4985]: I0224 10:12:10.976007 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.003984 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.015651 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 24 10:12:11 crc kubenswrapper[4985]: W0224 10:12:11.023172 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20592ddc_38f3_4407_a5c3_b719b89cd42e.slice/crio-ff6c482d44e81ef5551d2c16588d60c05c819ec59e8e3e405a92db2b3b95cc78 WatchSource:0}: Error finding container ff6c482d44e81ef5551d2c16588d60c05c819ec59e8e3e405a92db2b3b95cc78: Status 404 returned error can't find the container with id ff6c482d44e81ef5551d2c16588d60c05c819ec59e8e3e405a92db2b3b95cc78 Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.035333 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.055947 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 24 10:12:11 crc 
kubenswrapper[4985]: I0224 10:12:11.068315 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc233e3-9d26-4ec2-957f-350400af638e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.074704 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx"] Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.079660 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.093310 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-d7v4s"] Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.098975 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 24 10:12:11 crc kubenswrapper[4985]: W0224 10:12:11.111764 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5e6fc2_42c9_4794_91cc_1f74adf686db.slice/crio-882af0f92fe7d22f97cca77ee3792e1ca0cf3e4d5fad14d249e6f2062bf83793 WatchSource:0}: Error finding container 882af0f92fe7d22f97cca77ee3792e1ca0cf3e4d5fad14d249e6f2062bf83793: Status 404 returned error can't find the container with id 882af0f92fe7d22f97cca77ee3792e1ca0cf3e4d5fad14d249e6f2062bf83793 Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.123274 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.135627 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.143597 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc233e3-9d26-4ec2-957f-350400af638e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.157054 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.175448 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.196343 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.217310 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.233431 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" event={"ID":"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8","Type":"ContainerStarted","Data":"48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.233478 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" event={"ID":"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8","Type":"ContainerStarted","Data":"ebf79fb408673c2d0093fd42f3fea35acedeef7df5b0c6bea113c386afda222d"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 
10:12:11.235017 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.235251 4985 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nncfz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.235345 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" podUID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.236479 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-d7v4s" event={"ID":"8c5e6fc2-42c9-4794-91cc-1f74adf686db","Type":"ContainerStarted","Data":"882af0f92fe7d22f97cca77ee3792e1ca0cf3e4d5fad14d249e6f2062bf83793"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.236937 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.238603 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" event={"ID":"4958ed59-9dc5-4e53-a250-feb5516c6c97","Type":"ContainerStarted","Data":"677849722a34428433507a71fb74ac30963bfb815376a847235f2060052352b7"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.240172 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" 
event={"ID":"d98f046c-a925-4946-9571-4aba6008f4b5","Type":"ContainerStarted","Data":"6133f552b384eb8a9d7e959623dd1faf81cb3d3a3c772302bf71215a6d09de91"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.240210 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" event={"ID":"d98f046c-a925-4946-9571-4aba6008f4b5","Type":"ContainerStarted","Data":"579bc5f910f8acdc4f01f63b99fdfcea0850b7449a4c1e5d295ee54a92285d22"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.241835 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" event={"ID":"20592ddc-38f3-4407-a5c3-b719b89cd42e","Type":"ContainerStarted","Data":"e79df61ee8416172c627e78c1fe537792486444ac107125c0b8cdf78710fecf7"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.241883 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" event={"ID":"20592ddc-38f3-4407-a5c3-b719b89cd42e","Type":"ContainerStarted","Data":"ff6c482d44e81ef5551d2c16588d60c05c819ec59e8e3e405a92db2b3b95cc78"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.242297 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.242829 4985 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-t527v container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.242868 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" 
podUID="20592ddc-38f3-4407-a5c3-b719b89cd42e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.243381 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" event={"ID":"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9","Type":"ContainerStarted","Data":"b5755e17b84a90725fde4eceda61c2ec11dcb7889e015f778f88f8f92b3b8713"} Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.258392 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.276210 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.295875 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.315724 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.336564 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.356534 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.376491 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.414336 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2mfz\" (UniqueName: \"kubernetes.io/projected/efdfb8c2-12c0-4134-a05d-f7c880e70649-kube-api-access-r2mfz\") pod \"authentication-operator-69f744f599-bl6z5\" (UID: \"efdfb8c2-12c0-4134-a05d-f7c880e70649\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.431155 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mtzk\" (UniqueName: \"kubernetes.io/projected/8d200ef1-f292-460a-a971-1ac47d688fb1-kube-api-access-8mtzk\") pod \"console-operator-58897d9998-v8tln\" (UID: \"8d200ef1-f292-460a-a971-1ac47d688fb1\") " pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.452865 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trlvj\" (UniqueName: \"kubernetes.io/projected/d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba-kube-api-access-trlvj\") pod \"machine-approver-56656f9798-sbwld\" (UID: \"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.460043 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.474333 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdrz\" (UniqueName: \"kubernetes.io/projected/5dba6bb5-c7aa-45b0-82b4-990febac8e99-kube-api-access-qvdrz\") pod \"openshift-config-operator-7777fb866f-n8mff\" (UID: \"5dba6bb5-c7aa-45b0-82b4-990febac8e99\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.476329 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.479926 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.487947 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d97920-f891-4f3a-9ccc-b5c10a64e22b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qhzl5\" (UID: \"78d97920-f891-4f3a-9ccc-b5c10a64e22b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.496313 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 24 10:12:11 crc kubenswrapper[4985]: W0224 10:12:11.512467 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8dbe690_7c45_4846_9f75_6ec3cb4bc9ba.slice/crio-b9c6a9a19757cfc1913c9f4e4480c491f76133518bda4095edeab5aefa595843 WatchSource:0}: Error finding container 
b9c6a9a19757cfc1913c9f4e4480c491f76133518bda4095edeab5aefa595843: Status 404 returned error can't find the container with id b9c6a9a19757cfc1913c9f4e4480c491f76133518bda4095edeab5aefa595843 Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.530672 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vnbg\" (UniqueName: \"kubernetes.io/projected/175055a6-4d1f-47c1-8947-287934749c2b-kube-api-access-4vnbg\") pod \"openshift-controller-manager-operator-756b6f6bc6-cs28d\" (UID: \"175055a6-4d1f-47c1-8947-287934749c2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.534450 4985 request.go:700] Waited for 1.014819529s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.555882 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.556740 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qss5w\" (UniqueName: \"kubernetes.io/projected/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-kube-api-access-qss5w\") pod \"oauth-openshift-558db77b4-xh5q8\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.557812 4985 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" secret="" err="failed to sync secret cache: timed out waiting for the condition" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.557902 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.577097 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.596534 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.616588 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.637132 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.659057 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.659793 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bl6z5"] Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.675863 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.690748 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.696592 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 24 10:12:11 crc kubenswrapper[4985]: E0224 10:12:11.713654 4985 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 24 10:12:11 crc kubenswrapper[4985]: E0224 10:12:11.713746 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3da134f-45d4-4d12-bc38-87546e0920cd-cert podName:e3da134f-45d4-4d12-bc38-87546e0920cd nodeName:}" failed. No retries permitted until 2026-02-24 10:12:12.213724386 +0000 UTC m=+216.687916946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3da134f-45d4-4d12-bc38-87546e0920cd-cert") pod "ingress-canary-rmswq" (UID: "e3da134f-45d4-4d12-bc38-87546e0920cd") : failed to sync secret cache: timed out waiting for the condition Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.723472 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.732715 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.737164 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.751985 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.756189 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.764096 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nnr7f"] Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.775213 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.781117 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.798661 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.820381 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.836072 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.862562 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.875968 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.901435 4985 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.917662 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.929359 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xh5q8"] Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.938555 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.956472 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.976287 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.978649 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v8tln"] Feb 24 10:12:11 crc kubenswrapper[4985]: I0224 10:12:11.998080 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.015984 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.034872 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.055130 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.085102 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.096632 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.115898 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.138194 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.154870 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.176584 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.195529 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.215288 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.237193 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.238772 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3da134f-45d4-4d12-bc38-87546e0920cd-cert\") pod \"ingress-canary-rmswq\" (UID: 
\"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.248384 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n8mff"] Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.248440 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d"] Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.256012 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.256073 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" event={"ID":"d98f046c-a925-4946-9571-4aba6008f4b5","Type":"ContainerStarted","Data":"334d1250af13ed4e2ad7da8d9def65215e12a0c7a277bcc29e5d8eae50b4c887"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.257860 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-d7v4s" event={"ID":"8c5e6fc2-42c9-4794-91cc-1f74adf686db","Type":"ContainerStarted","Data":"41953cb0133fccb6fe75e8ebd4680bb0fd266a7cf52aa23c248257e65ea56699"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.269784 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" event={"ID":"4958ed59-9dc5-4e53-a250-feb5516c6c97","Type":"ContainerStarted","Data":"addcba7a585e941aa569a0a29f7975599c22772ca6ff9f0c175cec64e56d8cc5"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.269825 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" 
event={"ID":"4958ed59-9dc5-4e53-a250-feb5516c6c97","Type":"ContainerStarted","Data":"5e5057c94e33e7dc2f615c83a76eedd373d94b66d2c07c3e28f67124e6b87355"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.272506 4985 generic.go:334] "Generic (PLEG): container finished" podID="d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9" containerID="ba11f145a4551a1020f33d068b9fdbdc5b41e5c1e1ad17343dd03fc661c51c8e" exitCode=0 Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.272704 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" event={"ID":"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9","Type":"ContainerDied","Data":"ba11f145a4551a1020f33d068b9fdbdc5b41e5c1e1ad17343dd03fc661c51c8e"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.274014 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" event={"ID":"ee57ec8e-3901-4355-b744-5ed2eeb20c9d","Type":"ContainerStarted","Data":"e5ee134f2567dda87c8478f23813f6e237d4049cb0519e1d99dd910b58b687bf"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.275423 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.281148 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" event={"ID":"efdfb8c2-12c0-4134-a05d-f7c880e70649","Type":"ContainerStarted","Data":"55a8fb7821843cb71ad35edfe6e6c9e1a44ddb3ce4d18eb034129bc70fa02766"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.281233 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" event={"ID":"efdfb8c2-12c0-4134-a05d-f7c880e70649","Type":"ContainerStarted","Data":"88325f6618304a59fdc7164277bbbbada5cb9715426956403eb3e644bb2af99d"} Feb 24 10:12:12 crc 
kubenswrapper[4985]: I0224 10:12:12.287855 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v8tln" event={"ID":"8d200ef1-f292-460a-a971-1ac47d688fb1","Type":"ContainerStarted","Data":"cd56f268411197af2c747dbd8a07b2ebca2470dfa25d16dc9216ee121ad70b4f"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.287951 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v8tln" event={"ID":"8d200ef1-f292-460a-a971-1ac47d688fb1","Type":"ContainerStarted","Data":"dc3bc17f26d104c3b16021b8733c7a5b8842116f3e84b95a20a19ff6c20a603d"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.290071 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.290959 4985 patch_prober.go:28] interesting pod/console-operator-58897d9998-v8tln container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.291018 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v8tln" podUID="8d200ef1-f292-460a-a971-1ac47d688fb1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.297200 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.297395 4985 generic.go:334] "Generic (PLEG): container finished" podID="8905bada-0871-41b5-9a21-83e4d3884edf" 
containerID="bad4fad3b9ef1d7846ea7f64428e2b604b8829df1877e6d559e1c70281b029ac" exitCode=0 Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.297515 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" event={"ID":"8905bada-0871-41b5-9a21-83e4d3884edf","Type":"ContainerDied","Data":"bad4fad3b9ef1d7846ea7f64428e2b604b8829df1877e6d559e1c70281b029ac"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.297550 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" event={"ID":"8905bada-0871-41b5-9a21-83e4d3884edf","Type":"ContainerStarted","Data":"b23469643b1cba06651b3aa36af25ff845bf55c37124fc589b199e62d8e1566b"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.299235 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" event={"ID":"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba","Type":"ContainerStarted","Data":"edce594be7794f36b8efaf3f448b946a75a80eac149724c5d662e6d95d2b2c59"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.299263 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" event={"ID":"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba","Type":"ContainerStarted","Data":"43f37d750293c52d875fc7907d693fece5bd45d3dc5a3629e36197220e7c1a78"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.299279 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" event={"ID":"d8dbe690-7c45-4846-9f75-6ec3cb4bc9ba","Type":"ContainerStarted","Data":"b9c6a9a19757cfc1913c9f4e4480c491f76133518bda4095edeab5aefa595843"} Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.303458 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 
10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.310124 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.317215 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.341181 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.356420 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.385319 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3da134f-45d4-4d12-bc38-87546e0920cd-cert\") pod \"ingress-canary-rmswq\" (UID: \"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.397485 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.417031 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.436077 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.476956 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46nsb\" (UniqueName: \"kubernetes.io/projected/db4512d4-0e5a-467b-85da-2eb3addffa4f-kube-api-access-46nsb\") pod \"openshift-apiserver-operator-796bbdcf4f-ktpsr\" (UID: \"db4512d4-0e5a-467b-85da-2eb3addffa4f\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.512513 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hbp\" (UniqueName: \"kubernetes.io/projected/cab7ac62-05fe-4c59-90f8-7ca1297d2cd0-kube-api-access-d8hbp\") pod \"etcd-operator-b45778765-jhlml\" (UID: \"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.524831 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhb6t\" (UniqueName: \"kubernetes.io/projected/bce7d3e8-b855-43ae-8527-fbc14ac50521-kube-api-access-vhb6t\") pod \"downloads-7954f5f757-zjjx8\" (UID: \"bce7d3e8-b855-43ae-8527-fbc14ac50521\") " pod="openshift-console/downloads-7954f5f757-zjjx8" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.533713 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbd8d\" (UniqueName: \"kubernetes.io/projected/751d5065-e3f3-4fe5-9d24-ca6c7197d0d6-kube-api-access-jbd8d\") pod \"dns-operator-744455d44c-t8w87\" (UID: \"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.537434 4985 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.554289 4985 request.go:700] Waited for 1.930611129s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.555807 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.577216 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.595953 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.616357 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.635446 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.674625 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7std\" (UniqueName: \"kubernetes.io/projected/e3da134f-45d4-4d12-bc38-87546e0920cd-kube-api-access-z7std\") pod \"ingress-canary-rmswq\" (UID: \"e3da134f-45d4-4d12-bc38-87546e0920cd\") " pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.691173 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.696018 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc233e3-9d26-4ec2-957f-350400af638e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-prkmg\" (UID: \"2dc233e3-9d26-4ec2-957f-350400af638e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.697790 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.710456 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zjjx8" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.715569 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.717093 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrnms\" (UniqueName: \"kubernetes.io/projected/4cc3b979-d591-4106-8874-760925fd10f6-kube-api-access-lrnms\") pod \"router-default-5444994796-gcwfl\" (UID: \"4cc3b979-d591-4106-8874-760925fd10f6\") " pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.734095 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.741790 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lfgs\" (UniqueName: \"kubernetes.io/projected/78d97920-f891-4f3a-9ccc-b5c10a64e22b-kube-api-access-7lfgs\") pod \"control-plane-machine-set-operator-78cbb6b69f-qhzl5\" (UID: \"78d97920-f891-4f3a-9ccc-b5c10a64e22b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.751686 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.767532 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnc29\" (UniqueName: \"kubernetes.io/projected/9debae0d-c342-4558-8796-bc9798c297de-kube-api-access-pnc29\") pod \"migrator-59844c95c7-2hxbt\" (UID: \"9debae0d-c342-4558-8796-bc9798c297de\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.793469 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.798119 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.838172 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844212 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/122aaa41-07ff-451d-a71f-b343d6993525-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844241 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844268 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-bound-sa-token\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844284 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844301 4985 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad42793f-f18e-4948-95fe-84c7b2edcf9e-signing-key\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844326 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2364f27b-fc75-4e56-8122-d6bdcc763b0a-secret-volume\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844342 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t66n\" (UniqueName: \"kubernetes.io/projected/deefacd4-ae02-434c-bb74-2ac6b217e0e2-kube-api-access-9t66n\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844369 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844391 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-registry-tls\") pod 
\"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844404 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/deefacd4-ae02-434c-bb74-2ac6b217e0e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844434 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zlg\" (UniqueName: \"kubernetes.io/projected/34aea390-2b22-43e4-aabd-8c3d13390620-kube-api-access-f9zlg\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844450 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmd2\" (UniqueName: \"kubernetes.io/projected/db31f9f1-19fc-47b3-8570-5428789afce0-kube-api-access-rsmd2\") pod \"multus-admission-controller-857f4d67dd-mlncn\" (UID: \"db31f9f1-19fc-47b3-8570-5428789afce0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844469 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad42793f-f18e-4948-95fe-84c7b2edcf9e-signing-cabundle\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844484 4985 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da44f74b-f631-45cb-a8ad-78e36e49d2f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844504 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2ddf07b-fdd4-4d99-a09c-256ed526960f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844526 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/492804d6-fccf-4710-b64f-aef85fa80b2d-metrics-tls\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844562 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4s9\" (UniqueName: \"kubernetes.io/projected/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-kube-api-access-5t4s9\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844585 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/deefacd4-ae02-434c-bb74-2ac6b217e0e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844605 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-config\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844641 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/122aaa41-07ff-451d-a71f-b343d6993525-images\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844721 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxnmj\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-kube-api-access-rxnmj\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844762 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-serving-cert\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" Feb 24 
10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844812 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4389db90-c320-4227-b4cc-efc14354ac37-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hfkq2\" (UID: \"4389db90-c320-4227-b4cc-efc14354ac37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844831 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45e61914-7807-43c3-b80b-a5b355b7447e-srv-cert\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844850 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ddf07b-fdd4-4d99-a09c-256ed526960f-config\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844880 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-srv-cert\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844898 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-registry-certificates\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844936 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844953 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.844977 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5wzt\" (UniqueName: \"kubernetes.io/projected/4d3661f1-4dcc-48a4-9129-7052c5a2d098-kube-api-access-d5wzt\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845002 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/db31f9f1-19fc-47b3-8570-5428789afce0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mlncn\" (UID: \"db31f9f1-19fc-47b3-8570-5428789afce0\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845033 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9h7h\" (UniqueName: \"kubernetes.io/projected/122aaa41-07ff-451d-a71f-b343d6993525-kube-api-access-h9h7h\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845056 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmsl\" (UniqueName: \"kubernetes.io/projected/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-kube-api-access-8wmsl\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845070 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jp65\" (UniqueName: \"kubernetes.io/projected/ad42793f-f18e-4948-95fe-84c7b2edcf9e-kube-api-access-9jp65\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845094 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/122aaa41-07ff-451d-a71f-b343d6993525-proxy-tls\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845109 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d3661f1-4dcc-48a4-9129-7052c5a2d098-webhook-cert\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845139 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rbb7\" (UniqueName: \"kubernetes.io/projected/492804d6-fccf-4710-b64f-aef85fa80b2d-kube-api-access-9rbb7\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845675 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45e61914-7807-43c3-b80b-a5b355b7447e-profile-collector-cert\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845727 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmcsd\" (UniqueName: \"kubernetes.io/projected/2364f27b-fc75-4e56-8122-d6bdcc763b0a-kube-api-access-rmcsd\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845755 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdlkz\" (UniqueName: \"kubernetes.io/projected/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-kube-api-access-mdlkz\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" 
(UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845830 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845852 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da44f74b-f631-45cb-a8ad-78e36e49d2f1-config\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845876 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2364f27b-fc75-4e56-8122-d6bdcc763b0a-config-volume\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845900 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deefacd4-ae02-434c-bb74-2ac6b217e0e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845948 4985 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.845974 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktk8\" (UniqueName: \"kubernetes.io/projected/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-kube-api-access-wktk8\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846023 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d51aec2-2b30-49d0-8aea-019bea882940-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846044 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34aea390-2b22-43e4-aabd-8c3d13390620-proxy-tls\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846065 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c2ddf07b-fdd4-4d99-a09c-256ed526960f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846085 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9w9s\" (UniqueName: \"kubernetes.io/projected/4389db90-c320-4227-b4cc-efc14354ac37-kube-api-access-c9w9s\") pod \"package-server-manager-789f6589d5-hfkq2\" (UID: \"4389db90-c320-4227-b4cc-efc14354ac37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846106 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4d3661f1-4dcc-48a4-9129-7052c5a2d098-tmpfs\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846179 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846236 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da44f74b-f631-45cb-a8ad-78e36e49d2f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846272 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d51aec2-2b30-49d0-8aea-019bea882940-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846294 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846355 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d3661f1-4dcc-48a4-9129-7052c5a2d098-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846379 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-trusted-ca\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.846446 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8rw2\" (UniqueName: \"kubernetes.io/projected/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-kube-api-access-x8rw2\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:12 crc kubenswrapper[4985]: E0224 10:12:12.851998 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:13.351983561 +0000 UTC m=+217.826176121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.854170 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmswq" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.855172 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/492804d6-fccf-4710-b64f-aef85fa80b2d-config-volume\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.855236 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gphtn\" (UniqueName: \"kubernetes.io/projected/45e61914-7807-43c3-b80b-a5b355b7447e-kube-api-access-gphtn\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.855258 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34aea390-2b22-43e4-aabd-8c3d13390620-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.963520 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964072 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/4d3661f1-4dcc-48a4-9129-7052c5a2d098-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964111 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-mountpoint-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964189 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-trusted-ca\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964219 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8rw2\" (UniqueName: \"kubernetes.io/projected/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-kube-api-access-x8rw2\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964254 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/492804d6-fccf-4710-b64f-aef85fa80b2d-config-volume\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964302 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gphtn\" (UniqueName: \"kubernetes.io/projected/45e61914-7807-43c3-b80b-a5b355b7447e-kube-api-access-gphtn\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964335 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34aea390-2b22-43e4-aabd-8c3d13390620-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964444 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/122aaa41-07ff-451d-a71f-b343d6993525-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964494 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964564 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964587 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad42793f-f18e-4948-95fe-84c7b2edcf9e-signing-key\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964606 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-bound-sa-token\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964629 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2364f27b-fc75-4e56-8122-d6bdcc763b0a-secret-volume\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964651 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t66n\" (UniqueName: \"kubernetes.io/projected/deefacd4-ae02-434c-bb74-2ac6b217e0e2-kube-api-access-9t66n\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964707 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964741 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-registry-tls\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964852 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/deefacd4-ae02-434c-bb74-2ac6b217e0e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964925 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsmd2\" (UniqueName: \"kubernetes.io/projected/db31f9f1-19fc-47b3-8570-5428789afce0-kube-api-access-rsmd2\") pod \"multus-admission-controller-857f4d67dd-mlncn\" (UID: \"db31f9f1-19fc-47b3-8570-5428789afce0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964947 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad42793f-f18e-4948-95fe-84c7b2edcf9e-signing-cabundle\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964968 4985 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da44f74b-f631-45cb-a8ad-78e36e49d2f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.964992 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zlg\" (UniqueName: \"kubernetes.io/projected/34aea390-2b22-43e4-aabd-8c3d13390620-kube-api-access-f9zlg\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965017 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ea70e899-59ad-4ef2-b407-81c67de39e50-node-bootstrap-token\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965036 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/492804d6-fccf-4710-b64f-aef85fa80b2d-metrics-tls\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965057 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2ddf07b-fdd4-4d99-a09c-256ed526960f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965104 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4s9\" (UniqueName: \"kubernetes.io/projected/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-kube-api-access-5t4s9\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965125 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deefacd4-ae02-434c-bb74-2ac6b217e0e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965162 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-config\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965196 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/122aaa41-07ff-451d-a71f-b343d6993525-images\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965217 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" 
(UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-plugins-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965241 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-registration-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965260 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxnmj\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-kube-api-access-rxnmj\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965282 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-serving-cert\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965304 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4389db90-c320-4227-b4cc-efc14354ac37-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hfkq2\" (UID: \"4389db90-c320-4227-b4cc-efc14354ac37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:12 crc 
kubenswrapper[4985]: I0224 10:12:12.965325 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-socket-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965347 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45e61914-7807-43c3-b80b-a5b355b7447e-srv-cert\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965403 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ddf07b-fdd4-4d99-a09c-256ed526960f-config\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965425 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-srv-cert\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965445 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-registry-certificates\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965467 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965518 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965554 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5wzt\" (UniqueName: \"kubernetes.io/projected/4d3661f1-4dcc-48a4-9129-7052c5a2d098-kube-api-access-d5wzt\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965588 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/db31f9f1-19fc-47b3-8570-5428789afce0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mlncn\" (UID: \"db31f9f1-19fc-47b3-8570-5428789afce0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965622 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9h7h\" (UniqueName: 
\"kubernetes.io/projected/122aaa41-07ff-451d-a71f-b343d6993525-kube-api-access-h9h7h\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965642 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jp65\" (UniqueName: \"kubernetes.io/projected/ad42793f-f18e-4948-95fe-84c7b2edcf9e-kube-api-access-9jp65\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965660 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wmsl\" (UniqueName: \"kubernetes.io/projected/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-kube-api-access-8wmsl\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965680 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g96\" (UniqueName: \"kubernetes.io/projected/ea70e899-59ad-4ef2-b407-81c67de39e50-kube-api-access-n5g96\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965702 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/122aaa41-07ff-451d-a71f-b343d6993525-proxy-tls\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:12 crc 
kubenswrapper[4985]: I0224 10:12:12.965722 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d3661f1-4dcc-48a4-9129-7052c5a2d098-webhook-cert\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965752 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rbb7\" (UniqueName: \"kubernetes.io/projected/492804d6-fccf-4710-b64f-aef85fa80b2d-kube-api-access-9rbb7\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965774 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45e61914-7807-43c3-b80b-a5b355b7447e-profile-collector-cert\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965833 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmcsd\" (UniqueName: \"kubernetes.io/projected/2364f27b-fc75-4e56-8122-d6bdcc763b0a-kube-api-access-rmcsd\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965855 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdlkz\" (UniqueName: \"kubernetes.io/projected/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-kube-api-access-mdlkz\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: 
\"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965929 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ea70e899-59ad-4ef2-b407-81c67de39e50-certs\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965957 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965977 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da44f74b-f631-45cb-a8ad-78e36e49d2f1-config\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.965997 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2364f27b-fc75-4e56-8122-d6bdcc763b0a-config-volume\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966031 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2gs\" (UniqueName: \"kubernetes.io/projected/7b4ae5a0-ee86-4518-b754-4b57da2dc152-kube-api-access-rw2gs\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966054 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deefacd4-ae02-434c-bb74-2ac6b217e0e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966084 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktk8\" (UniqueName: \"kubernetes.io/projected/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-kube-api-access-wktk8\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966118 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d51aec2-2b30-49d0-8aea-019bea882940-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966144 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-csi-data-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966168 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34aea390-2b22-43e4-aabd-8c3d13390620-proxy-tls\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966191 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ddf07b-fdd4-4d99-a09c-256ed526960f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966211 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9w9s\" (UniqueName: \"kubernetes.io/projected/4389db90-c320-4227-b4cc-efc14354ac37-kube-api-access-c9w9s\") pod \"package-server-manager-789f6589d5-hfkq2\" (UID: \"4389db90-c320-4227-b4cc-efc14354ac37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966246 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4d3661f1-4dcc-48a4-9129-7052c5a2d098-tmpfs\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966328 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966348 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da44f74b-f631-45cb-a8ad-78e36e49d2f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966369 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d51aec2-2b30-49d0-8aea-019bea882940-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.966391 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.968041 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/122aaa41-07ff-451d-a71f-b343d6993525-images\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.972451 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-serving-cert\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.976407 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-trusted-ca\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.977018 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/492804d6-fccf-4710-b64f-aef85fa80b2d-config-volume\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.988784 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2364f27b-fc75-4e56-8122-d6bdcc763b0a-secret-volume\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.992800 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/34aea390-2b22-43e4-aabd-8c3d13390620-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7"
Feb 24 10:12:12 crc kubenswrapper[4985]: I0224 10:12:12.993480 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad42793f-f18e-4948-95fe-84c7b2edcf9e-signing-cabundle\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.034663 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.041487 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4389db90-c320-4227-b4cc-efc14354ac37-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hfkq2\" (UID: \"4389db90-c320-4227-b4cc-efc14354ac37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.041832 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deefacd4-ae02-434c-bb74-2ac6b217e0e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.042881 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-srv-cert\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.043848 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d3661f1-4dcc-48a4-9129-7052c5a2d098-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.045092 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/122aaa41-07ff-451d-a71f-b343d6993525-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.045255 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/db31f9f1-19fc-47b3-8570-5428789afce0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mlncn\" (UID: \"db31f9f1-19fc-47b3-8570-5428789afce0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.045570 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/492804d6-fccf-4710-b64f-aef85fa80b2d-metrics-tls\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.046079 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45e61914-7807-43c3-b80b-a5b355b7447e-srv-cert\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.046205 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d51aec2-2b30-49d0-8aea-019bea882940-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.047211 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad42793f-f18e-4948-95fe-84c7b2edcf9e-signing-key\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.048168 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ddf07b-fdd4-4d99-a09c-256ed526960f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.049492 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-registry-certificates\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.050991 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d51aec2-2b30-49d0-8aea-019bea882940-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.051775 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4d3661f1-4dcc-48a4-9129-7052c5a2d098-tmpfs\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.055612 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ddf07b-fdd4-4d99-a09c-256ed526960f-config\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.071298 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.079791 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.080336 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.081224 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.081706 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-registry-tls\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.082798 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da44f74b-f631-45cb-a8ad-78e36e49d2f1-config\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"
Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.084625 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:13.584610106 +0000 UTC m=+218.058802656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.084697 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-mountpoint-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.084747 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-mountpoint-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.085735 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-registration-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.085793 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-plugins-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.086292 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-plugins-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.086352 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-registration-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.089183 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/34aea390-2b22-43e4-aabd-8c3d13390620-proxy-tls\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.094073 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2364f27b-fc75-4e56-8122-d6bdcc763b0a-config-volume\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.094239 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/122aaa41-07ff-451d-a71f-b343d6993525-proxy-tls\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.095045 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxnmj\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-kube-api-access-rxnmj\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.097146 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d3661f1-4dcc-48a4-9129-7052c5a2d098-webhook-cert\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.064900 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-config\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.097878 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jhlml"]
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.098089 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t66n\" (UniqueName: \"kubernetes.io/projected/deefacd4-ae02-434c-bb74-2ac6b217e0e2-kube-api-access-9t66n\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.102551 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-bound-sa-token\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.108300 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da44f74b-f631-45cb-a8ad-78e36e49d2f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.109662 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.112304 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/deefacd4-ae02-434c-bb74-2ac6b217e0e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.121401 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2ddf07b-fdd4-4d99-a09c-256ed526960f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2hv9t\" (UID: \"c2ddf07b-fdd4-4d99-a09c-256ed526960f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.121882 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.124443 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5wzt\" (UniqueName: \"kubernetes.io/projected/4d3661f1-4dcc-48a4-9129-7052c5a2d098-kube-api-access-d5wzt\") pod \"packageserver-d55dfcdfc-cfp92\" (UID: \"4d3661f1-4dcc-48a4-9129-7052c5a2d098\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.127741 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4s9\" (UniqueName: \"kubernetes.io/projected/b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0-kube-api-access-5t4s9\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xfms\" (UID: \"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.128480 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45e61914-7807-43c3-b80b-a5b355b7447e-profile-collector-cert\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.130721 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr"]
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.157350 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deefacd4-ae02-434c-bb74-2ac6b217e0e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lwqzt\" (UID: \"deefacd4-ae02-434c-bb74-2ac6b217e0e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.194521 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ea70e899-59ad-4ef2-b407-81c67de39e50-node-bootstrap-token\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.194607 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-socket-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.194638 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8rw2\" (UniqueName: \"kubernetes.io/projected/1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87-kube-api-access-x8rw2\") pod \"olm-operator-6b444d44fb-mvvxj\" (UID: \"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.194662 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g96\" (UniqueName: \"kubernetes.io/projected/ea70e899-59ad-4ef2-b407-81c67de39e50-kube-api-access-n5g96\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.194868 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-socket-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.202828 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gphtn\" (UniqueName: \"kubernetes.io/projected/45e61914-7807-43c3-b80b-a5b355b7447e-kube-api-access-gphtn\") pod \"catalog-operator-68c6474976-2vcf4\" (UID: \"45e61914-7807-43c3-b80b-a5b355b7447e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.203741 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.210321 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ea70e899-59ad-4ef2-b407-81c67de39e50-node-bootstrap-token\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.215105 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ea70e899-59ad-4ef2-b407-81c67de39e50-certs\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.215195 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw2gs\" (UniqueName: \"kubernetes.io/projected/7b4ae5a0-ee86-4518-b754-4b57da2dc152-kube-api-access-rw2gs\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.215247 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.215294 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-csi-data-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.215715 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:13.715694595 +0000 UTC m=+218.189887155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.215868 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7b4ae5a0-ee86-4518-b754-4b57da2dc152-csi-data-dir\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.225950 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ea70e899-59ad-4ef2-b407-81c67de39e50-certs\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.227062 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmd2\" (UniqueName: \"kubernetes.io/projected/db31f9f1-19fc-47b3-8570-5428789afce0-kube-api-access-rsmd2\") pod \"multus-admission-controller-857f4d67dd-mlncn\" (UID: \"db31f9f1-19fc-47b3-8570-5428789afce0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.235563 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.251939 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktk8\" (UniqueName: \"kubernetes.io/projected/139ca41b-7c70-4db2-a9bf-a5495fdeadbe-kube-api-access-wktk8\") pod \"service-ca-operator-777779d784-225jm\" (UID: \"139ca41b-7c70-4db2-a9bf-a5495fdeadbe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.252574 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.255670 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.268558 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.283415 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.287640 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t8w87"]
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.295244 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9h7h\" (UniqueName: \"kubernetes.io/projected/122aaa41-07ff-451d-a71f-b343d6993525-kube-api-access-h9h7h\") pod \"machine-config-operator-74547568cd-pswjd\" (UID: \"122aaa41-07ff-451d-a71f-b343d6993525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.296986 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da44f74b-f631-45cb-a8ad-78e36e49d2f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4hthm\" (UID: \"da44f74b-f631-45cb-a8ad-78e36e49d2f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.316696 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.316999 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jp65\" (UniqueName: \"kubernetes.io/projected/ad42793f-f18e-4948-95fe-84c7b2edcf9e-kube-api-access-9jp65\") pod \"service-ca-9c57cc56f-txrfz\" (UID: \"ad42793f-f18e-4948-95fe-84c7b2edcf9e\") " pod="openshift-service-ca/service-ca-9c57cc56f-txrfz"
Feb 24 10:12:13 crc
kubenswrapper[4985]: I0224 10:12:13.319235 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wmsl\" (UniqueName: \"kubernetes.io/projected/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-kube-api-access-8wmsl\") pod \"marketplace-operator-79b997595-rlq9j\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.319496 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:13.819471659 +0000 UTC m=+218.293664209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.321361 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zjjx8"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.331218 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.336546 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9w9s\" (UniqueName: \"kubernetes.io/projected/4389db90-c320-4227-b4cc-efc14354ac37-kube-api-access-c9w9s\") pod \"package-server-manager-789f6589d5-hfkq2\" (UID: \"4389db90-c320-4227-b4cc-efc14354ac37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.343130 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.358698 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.370889 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rbb7\" (UniqueName: \"kubernetes.io/projected/492804d6-fccf-4710-b64f-aef85fa80b2d-kube-api-access-9rbb7\") pod \"dns-default-w5s2c\" (UID: \"492804d6-fccf-4710-b64f-aef85fa80b2d\") " pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.378344 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.396949 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" event={"ID":"ee57ec8e-3901-4355-b744-5ed2eeb20c9d","Type":"ContainerStarted","Data":"93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.400612 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.407534 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" event={"ID":"db4512d4-0e5a-467b-85da-2eb3addffa4f","Type":"ContainerStarted","Data":"f4fe252e8f15d61112773264f9803cd9e081a407cb4ac4ff3fa424128220245b"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.420754 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" event={"ID":"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6","Type":"ContainerStarted","Data":"cda8f6a161d9fa4ba7818856107ec4c1cea9e77b85e1445592149bfc2136d4d0"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.422010 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.422594 4985 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:13.922578772 +0000 UTC m=+218.396771332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.422672 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.423422 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zlg\" (UniqueName: \"kubernetes.io/projected/34aea390-2b22-43e4-aabd-8c3d13390620-kube-api-access-f9zlg\") pod \"machine-config-controller-84d6567774-ww7q7\" (UID: \"34aea390-2b22-43e4-aabd-8c3d13390620\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.423828 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmcsd\" (UniqueName: \"kubernetes.io/projected/2364f27b-fc75-4e56-8122-d6bdcc763b0a-kube-api-access-rmcsd\") pod \"collect-profiles-29532120-q5v4c\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.436537 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdlkz\" (UniqueName: 
\"kubernetes.io/projected/b85b9142-2d6e-47e9-8d88-bed8bbaa6df7-kube-api-access-mdlkz\") pod \"cluster-image-registry-operator-dc59b4c8b-25d4p\" (UID: \"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.440475 4985 generic.go:334] "Generic (PLEG): container finished" podID="5dba6bb5-c7aa-45b0-82b4-990febac8e99" containerID="8ecb3b6ea324b995173bce0c0a2a802c1d5f8f8270376a3a08b91bf8fecb8a65" exitCode=0 Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.440573 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" event={"ID":"5dba6bb5-c7aa-45b0-82b4-990febac8e99","Type":"ContainerDied","Data":"8ecb3b6ea324b995173bce0c0a2a802c1d5f8f8270376a3a08b91bf8fecb8a65"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.440615 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" event={"ID":"5dba6bb5-c7aa-45b0-82b4-990febac8e99","Type":"ContainerStarted","Data":"3c23e7dce6275cb2962eacabf1b79255f999f121e919a842a3948f5cf0be8350"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.442793 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" event={"ID":"175055a6-4d1f-47c1-8947-287934749c2b","Type":"ContainerStarted","Data":"fb2a5169ad0e7c580e67cd2e2ae28017b522d93a28b9e481faf3ac33985cf7bd"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.442830 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" event={"ID":"175055a6-4d1f-47c1-8947-287934749c2b","Type":"ContainerStarted","Data":"8d65fe77720686c4911a9cd05c51b29ae8698543ac312976668ca8efd0103e36"} Feb 24 10:12:13 crc kubenswrapper[4985]: 
I0224 10:12:13.449221 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" event={"ID":"d93e05b9-4ed8-4cd6-a961-25cf43ed4cf9","Type":"ContainerStarted","Data":"ed78f44d6bdbbdb7944a5525fe38db65d0e4234d0ce3bc7494aac933a717a9ba"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.449984 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.452101 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gcwfl" event={"ID":"4cc3b979-d591-4106-8874-760925fd10f6","Type":"ContainerStarted","Data":"213ce9add2be72e6348dc12e71d4e2c95d65bdb9721725a913a113fe5996aa90"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.452138 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gcwfl" event={"ID":"4cc3b979-d591-4106-8874-760925fd10f6","Type":"ContainerStarted","Data":"d5ea524fec1f1bf18c086ced24cbbc1fa537f0d5db84c8576a042ae2cb7f0a76"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.454224 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" event={"ID":"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0","Type":"ContainerStarted","Data":"a05d8d9081a5b857e560723b393689ffd3e4dcd3139028d2b1b2d4e7b1bb72eb"} Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.455544 4985 patch_prober.go:28] interesting pod/console-operator-58897d9998-v8tln container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.455582 4985 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-v8tln" podUID="8d200ef1-f292-460a-a971-1ac47d688fb1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.462044 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.468231 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g96\" (UniqueName: \"kubernetes.io/projected/ea70e899-59ad-4ef2-b407-81c67de39e50-kube-api-access-n5g96\") pod \"machine-config-server-6ssps\" (UID: \"ea70e899-59ad-4ef2-b407-81c67de39e50\") " pod="openshift-machine-config-operator/machine-config-server-6ssps" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.469829 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" Feb 24 10:12:13 crc kubenswrapper[4985]: W0224 10:12:13.483336 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dc233e3_9d26_4ec2_957f_350400af638e.slice/crio-41bd3c77677ebc1025842c5478e266c3551b4fef4755224e93a76d85d9529bc9 WatchSource:0}: Error finding container 41bd3c77677ebc1025842c5478e266c3551b4fef4755224e93a76d85d9529bc9: Status 404 returned error can't find the container with id 41bd3c77677ebc1025842c5478e266c3551b4fef4755224e93a76d85d9529bc9 Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.486330 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.499183 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.504492 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2gs\" (UniqueName: \"kubernetes.io/projected/7b4ae5a0-ee86-4518-b754-4b57da2dc152-kube-api-access-rw2gs\") pod \"csi-hostpathplugin-n5mkh\" (UID: \"7b4ae5a0-ee86-4518-b754-4b57da2dc152\") " pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.523218 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.524803 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.024769888 +0000 UTC m=+218.498962448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.531597 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37266: no serving certificate available for the kubelet" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.540554 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.543678 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6ssps" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.574281 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.592931 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.596350 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.628775 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37268: no serving certificate available for the kubelet" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.629782 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.629837 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.661774 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.668526 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.168461853 +0000 UTC m=+218.642654423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.668790 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.689122 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.715720 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37282: no serving certificate available for the kubelet" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.736372 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.737936 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.737982 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.749233 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.751872 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rmswq"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.792215 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 
10:12:13.808243 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.809052 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.309029012 +0000 UTC m=+218.783221572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.817275 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37298: no serving certificate available for the kubelet" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.895110 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cs28d" podStartSLOduration=171.895089235 podStartE2EDuration="2m51.895089235s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:13.852190222 +0000 UTC m=+218.326382782" watchObservedRunningTime="2026-02-24 10:12:13.895089235 +0000 UTC 
m=+218.369281795" Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.897016 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mlncn"] Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.910397 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:13 crc kubenswrapper[4985]: E0224 10:12:13.910862 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.410848706 +0000 UTC m=+218.885041266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.920073 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37306: no serving certificate available for the kubelet" Feb 24 10:12:13 crc kubenswrapper[4985]: W0224 10:12:13.920881 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9debae0d_c342_4558_8796_bc9798c297de.slice/crio-00112e458d8383824b6d8270ceb184af0ed8b9ef2b761add5e1d4d45ef8e94d9 WatchSource:0}: Error finding container 00112e458d8383824b6d8270ceb184af0ed8b9ef2b761add5e1d4d45ef8e94d9: Status 404 returned error can't find the container with id 00112e458d8383824b6d8270ceb184af0ed8b9ef2b761add5e1d4d45ef8e94d9 Feb 24 10:12:13 crc kubenswrapper[4985]: W0224 10:12:13.962263 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6a6001a_7b27_4dc4_a1c5_5bbaa449dfe0.slice/crio-aa958926f91fe7a20c2ee2200cb22e58ba65cd3f5b5e17959826d53297818d77 WatchSource:0}: Error finding container aa958926f91fe7a20c2ee2200cb22e58ba65cd3f5b5e17959826d53297818d77: Status 404 returned error can't find the container with id aa958926f91fe7a20c2ee2200cb22e58ba65cd3f5b5e17959826d53297818d77 Feb 24 10:12:13 crc kubenswrapper[4985]: I0224 10:12:13.975925 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.011564 4985 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.011944 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.511926978 +0000 UTC m=+218.986119538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.018261 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37318: no serving certificate available for the kubelet" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.020772 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-v8tln" podStartSLOduration=172.020751938 podStartE2EDuration="2m52.020751938s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.018661614 +0000 UTC m=+218.492854174" watchObservedRunningTime="2026-02-24 10:12:14.020751938 +0000 UTC m=+218.494944498" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 
10:12:14.113260 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.113646 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.613635479 +0000 UTC m=+219.087828039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.124156 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37332: no serving certificate available for the kubelet" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.136299 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" podStartSLOduration=172.136284421 podStartE2EDuration="2m52.136284421s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.135820238 +0000 UTC m=+218.610012798" watchObservedRunningTime="2026-02-24 10:12:14.136284421 +0000 UTC m=+218.610476971" 
Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.137505 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bl6z5" podStartSLOduration=172.137498639 podStartE2EDuration="2m52.137498639s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.090231343 +0000 UTC m=+218.564423913" watchObservedRunningTime="2026-02-24 10:12:14.137498639 +0000 UTC m=+218.611691199" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.218965 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.219435 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.719414774 +0000 UTC m=+219.193607334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.230655 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gcwfl" podStartSLOduration=172.230634887 podStartE2EDuration="2m52.230634887s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.222089235 +0000 UTC m=+218.696281805" watchObservedRunningTime="2026-02-24 10:12:14.230634887 +0000 UTC m=+218.704827447" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.252357 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.255981 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37348: no serving certificate available for the kubelet" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.320662 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.321000 4985 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.820987701 +0000 UTC m=+219.295180261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.336469 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" podStartSLOduration=172.336450934 podStartE2EDuration="2m52.336450934s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.335428432 +0000 UTC m=+218.809620992" watchObservedRunningTime="2026-02-24 10:12:14.336450934 +0000 UTC m=+218.810643494" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.362701 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" podStartSLOduration=172.362681976 podStartE2EDuration="2m52.362681976s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.360890491 +0000 UTC m=+218.835083041" watchObservedRunningTime="2026-02-24 10:12:14.362681976 +0000 UTC m=+218.836874536" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.369953 4985 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-225jm"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.405218 4985 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xh5q8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.405273 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" podUID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.421767 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.422229 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:14.922214506 +0000 UTC m=+219.396407066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.486413 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" event={"ID":"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0","Type":"ContainerStarted","Data":"aa958926f91fe7a20c2ee2200cb22e58ba65cd3f5b5e17959826d53297818d77"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.487977 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.496848 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" event={"ID":"9debae0d-c342-4558-8796-bc9798c297de","Type":"ContainerStarted","Data":"00112e458d8383824b6d8270ceb184af0ed8b9ef2b761add5e1d4d45ef8e94d9"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.497831 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" event={"ID":"2dc233e3-9d26-4ec2-957f-350400af638e","Type":"ContainerStarted","Data":"41bd3c77677ebc1025842c5478e266c3551b4fef4755224e93a76d85d9529bc9"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.499459 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" 
event={"ID":"cab7ac62-05fe-4c59-90f8-7ca1297d2cd0","Type":"ContainerStarted","Data":"f55dc6478b79cbe01cc3313999c1d1e797b9800d19ef49eac64cf0d46f3e3401"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.502944 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" event={"ID":"db31f9f1-19fc-47b3-8570-5428789afce0","Type":"ContainerStarted","Data":"7ad8771ee8a46599a20006c0a8894cc0dda9b3db4766eb25b5da96ce1e55dbb5"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.510140 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rmswq" event={"ID":"e3da134f-45d4-4d12-bc38-87546e0920cd","Type":"ContainerStarted","Data":"4e9e038580e1b790de62ff29eab46d52c355d631966cb07b4a63a7814c108769"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.522950 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.523275 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.023260737 +0000 UTC m=+219.497453297 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.523701 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" event={"ID":"45e61914-7807-43c3-b80b-a5b355b7447e","Type":"ContainerStarted","Data":"7908bdc3a9f104d23ef77d64a0d64d58c405e9b87411f5bfcdf97bbf05b42cc6"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.546977 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" event={"ID":"78d97920-f891-4f3a-9ccc-b5c10a64e22b","Type":"ContainerStarted","Data":"9e100bdb0f9f33a6ea591562f0c9dae847a8a5099213ad51f0f08df12c9af69d"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.563423 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rlq9j"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.591958 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.618690 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" event={"ID":"8905bada-0871-41b5-9a21-83e4d3884edf","Type":"ContainerStarted","Data":"b4d910be0205d33c1051b9171c21f82708c2d1f28379c24b4b43ffaea0aef377"} Feb 24 10:12:14 crc kubenswrapper[4985]: W0224 10:12:14.621536 4985 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139ca41b_7c70_4db2_a9bf_a5495fdeadbe.slice/crio-c3f4f8a23cac129a2c332e4f008a333a22b17b2a0bf3b157a564cf18a871ad02 WatchSource:0}: Error finding container c3f4f8a23cac129a2c332e4f008a333a22b17b2a0bf3b157a564cf18a871ad02: Status 404 returned error can't find the container with id c3f4f8a23cac129a2c332e4f008a333a22b17b2a0bf3b157a564cf18a871ad02 Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.623958 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.624326 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.124307698 +0000 UTC m=+219.598500258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.656884 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" event={"ID":"db4512d4-0e5a-467b-85da-2eb3addffa4f","Type":"ContainerStarted","Data":"8e453834a98ef39d37cd0fa35a6843bb76b101b72422b8a23b1af6bcc5a7c679"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.665336 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-d7v4s" podStartSLOduration=172.665309482 podStartE2EDuration="2m52.665309482s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.632179048 +0000 UTC m=+219.106371608" watchObservedRunningTime="2026-02-24 10:12:14.665309482 +0000 UTC m=+219.139502042" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.701649 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbwld" podStartSLOduration=172.701631503 podStartE2EDuration="2m52.701631503s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:14.700965172 +0000 UTC m=+219.175157732" watchObservedRunningTime="2026-02-24 10:12:14.701631503 +0000 UTC m=+219.175824053" Feb 24 10:12:14 crc 
kubenswrapper[4985]: I0224 10:12:14.720760 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zjjx8" event={"ID":"bce7d3e8-b855-43ae-8527-fbc14ac50521","Type":"ContainerStarted","Data":"936efa276429e3ed3e00047c9db6382a65e2dacc8057c707828a61054cd95095"} Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.729092 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.729354 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.229340171 +0000 UTC m=+219.703532731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.741446 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.741484 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.747179 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.797214 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-v8tln" Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.844187 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 
10:12:14.844354 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.344328888 +0000 UTC m=+219.818521448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.845057 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.847855 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.347823664 +0000 UTC m=+219.822016224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.937787 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n5mkh"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.946357 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:14 crc kubenswrapper[4985]: E0224 10:12:14.946668 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.446652807 +0000 UTC m=+219.920845367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.973087 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt"] Feb 24 10:12:14 crc kubenswrapper[4985]: I0224 10:12:14.977997 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52686: no serving certificate available for the kubelet" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.029869 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.048279 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.048659 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.548644927 +0000 UTC m=+220.022837487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.161256 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.161898 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.66187188 +0000 UTC m=+220.136064440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.173665 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t79l8" podStartSLOduration=173.17364638 podStartE2EDuration="2m53.17364638s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:15.14750045 +0000 UTC m=+219.621693000" watchObservedRunningTime="2026-02-24 10:12:15.17364638 +0000 UTC m=+219.647838940" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.176317 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.270987 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.280950 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-24 10:12:15.771522013 +0000 UTC m=+220.245714573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.326391 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w5s2c"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.328087 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.347404 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" podStartSLOduration=173.347384613 podStartE2EDuration="2m53.347384613s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:15.346410073 +0000 UTC m=+219.820602633" watchObservedRunningTime="2026-02-24 10:12:15.347384613 +0000 UTC m=+219.821577173" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.374296 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.375326 4985 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.875289057 +0000 UTC m=+220.349481617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.376305 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-z6mz6" podStartSLOduration=173.376281458 podStartE2EDuration="2m53.376281458s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:15.372041398 +0000 UTC m=+219.846233968" watchObservedRunningTime="2026-02-24 10:12:15.376281458 +0000 UTC m=+219.850474018" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.447745 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nncfz"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.475999 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.476672 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:15.976653587 +0000 UTC m=+220.450846147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.490196 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.490416 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" podUID="20592ddc-38f3-4407-a5c3-b719b89cd42e" containerName="route-controller-manager" containerID="cri-o://e79df61ee8416172c627e78c1fe537792486444ac107125c0b8cdf78710fecf7" gracePeriod=30 Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.498535 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktpsr" podStartSLOduration=174.498501186 podStartE2EDuration="2m54.498501186s" podCreationTimestamp="2026-02-24 10:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:15.477656798 +0000 UTC 
m=+219.951849378" watchObservedRunningTime="2026-02-24 10:12:15.498501186 +0000 UTC m=+219.972693746" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.577670 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.577988 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.077972286 +0000 UTC m=+220.552164846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.639537 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.639987 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.663842 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.667013 
4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.689139 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.689461 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.689846 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.189830328 +0000 UTC m=+220.664022888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.746937 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-txrfz"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.752288 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:15 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:15 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:15 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.752411 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.788857 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" event={"ID":"2364f27b-fc75-4e56-8122-d6bdcc763b0a","Type":"ContainerStarted","Data":"f60f137172f390d3b858db92e0e41ecf78bbfc35625bb45e4b106135474184df"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.791888 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" event={"ID":"c2ddf07b-fdd4-4d99-a09c-256ed526960f","Type":"ContainerStarted","Data":"62e207bb8bd69a529d6e61aba2dc3c1bba6827788822e72184900d15ea91cc23"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.791968 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.792296 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.292273331 +0000 UTC m=+220.766465901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.792571 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.793308 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.293293561 +0000 UTC m=+220.767486121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.800853 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7"] Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.827051 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" event={"ID":"4d3661f1-4dcc-48a4-9129-7052c5a2d098","Type":"ContainerStarted","Data":"7a44ee6a99693c46a1b09683ee19cc46e7293080e884d7260f602e5cc7c535d7"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.853375 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" event={"ID":"7b4ae5a0-ee86-4518-b754-4b57da2dc152","Type":"ContainerStarted","Data":"b5b9f43cceed3a9526d44d5975443767088844e7e49b6d610fe1714a95f01657"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.860136 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" event={"ID":"139ca41b-7c70-4db2-a9bf-a5495fdeadbe","Type":"ContainerStarted","Data":"c3f4f8a23cac129a2c332e4f008a333a22b17b2a0bf3b157a564cf18a871ad02"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.865494 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" 
event={"ID":"122aaa41-07ff-451d-a71f-b343d6993525","Type":"ContainerStarted","Data":"b4afdebea21cb0d7efbacf71ed3fd2cd5da37dfcf5a07fe63e352e2735b77fd6"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.878420 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" event={"ID":"78d97920-f891-4f3a-9ccc-b5c10a64e22b","Type":"ContainerStarted","Data":"28bb75e986758fd5c0e87d21b10e842704bd7ab5fe5b73375c6662db4531fdc5"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.894197 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.894561 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.394543429 +0000 UTC m=+220.868735989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.896853 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" event={"ID":"deefacd4-ae02-434c-bb74-2ac6b217e0e2","Type":"ContainerStarted","Data":"77e369e6c72636adb84bb9da31e5ac19c0881ac283805a3c40a68a6d389c2860"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.898829 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qhzl5" podStartSLOduration=173.898810709 podStartE2EDuration="2m53.898810709s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:15.898323684 +0000 UTC m=+220.372516244" watchObservedRunningTime="2026-02-24 10:12:15.898810709 +0000 UTC m=+220.373003269" Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.920625 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" event={"ID":"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6","Type":"ContainerStarted","Data":"b56c0d2f2aafc03aaf6b245310b9a849de42128e2d20ea03b2c05709d2cde04b"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.934898 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" 
event={"ID":"6ef771ef-28ac-46c6-925e-f12a7a70b6c3","Type":"ContainerStarted","Data":"dc31b55ac10c29aca78eb19bd76ffbaaf12fb4838d2dfa04b79f36a9cce91631"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.945651 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w5s2c" event={"ID":"492804d6-fccf-4710-b64f-aef85fa80b2d","Type":"ContainerStarted","Data":"9d85865bf4cdf7f1d2d192b13d1871eaf80b0b3693c7b744d307005668b2df3f"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.979149 4985 generic.go:334] "Generic (PLEG): container finished" podID="20592ddc-38f3-4407-a5c3-b719b89cd42e" containerID="e79df61ee8416172c627e78c1fe537792486444ac107125c0b8cdf78710fecf7" exitCode=0 Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.979274 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" event={"ID":"20592ddc-38f3-4407-a5c3-b719b89cd42e","Type":"ContainerDied","Data":"e79df61ee8416172c627e78c1fe537792486444ac107125c0b8cdf78710fecf7"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.985678 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" event={"ID":"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87","Type":"ContainerStarted","Data":"2fb1f882cd1d54d3e0e18989275a37cb5a4e0a3e60dbfa1366c4b6ef61e07730"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.994034 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" event={"ID":"5dba6bb5-c7aa-45b0-82b4-990febac8e99","Type":"ContainerStarted","Data":"8df971f5ab5a813891bc058b76928a0c17d7c9a1daa23486c17f6e33c13e0294"} Feb 24 10:12:15 crc kubenswrapper[4985]: I0224 10:12:15.994332 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:15 crc 
kubenswrapper[4985]: I0224 10:12:15.995289 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:15 crc kubenswrapper[4985]: E0224 10:12:15.996761 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.496744245 +0000 UTC m=+220.970936805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.006855 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6ssps" event={"ID":"ea70e899-59ad-4ef2-b407-81c67de39e50","Type":"ContainerStarted","Data":"4882da41fef5aa0d2ac96c397fce85e9ca1ee2e18758b866cbc2d63a531b24c4"} Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.009709 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" event={"ID":"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7","Type":"ContainerStarted","Data":"57896faef1c55a2e07a1587a2dc05902bb0109c2bab00151c6041687a206f2dc"} Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.013339 
4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" podUID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" containerName="controller-manager" containerID="cri-o://48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df" gracePeriod=30 Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.018324 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" podStartSLOduration=174.018308784 podStartE2EDuration="2m54.018308784s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:16.016824689 +0000 UTC m=+220.491017249" watchObservedRunningTime="2026-02-24 10:12:16.018308784 +0000 UTC m=+220.492501344" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.024850 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jlxbx" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.034942 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-jhlml" podStartSLOduration=174.034918652 podStartE2EDuration="2m54.034918652s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:16.034241782 +0000 UTC m=+220.508434342" watchObservedRunningTime="2026-02-24 10:12:16.034918652 +0000 UTC m=+220.509111222" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.096101 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.100959 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.600930351 +0000 UTC m=+221.075122911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.197749 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.198420 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.698409843 +0000 UTC m=+221.172602403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.250432 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.299578 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-client-ca\") pod \"20592ddc-38f3-4407-a5c3-b719b89cd42e\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.299620 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-config\") pod \"20592ddc-38f3-4407-a5c3-b719b89cd42e\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.299691 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20592ddc-38f3-4407-a5c3-b719b89cd42e-serving-cert\") pod \"20592ddc-38f3-4407-a5c3-b719b89cd42e\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.299941 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.299981 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbk9r\" (UniqueName: \"kubernetes.io/projected/20592ddc-38f3-4407-a5c3-b719b89cd42e-kube-api-access-bbk9r\") pod \"20592ddc-38f3-4407-a5c3-b719b89cd42e\" (UID: \"20592ddc-38f3-4407-a5c3-b719b89cd42e\") " Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.305651 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.805626201 +0000 UTC m=+221.279818761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.306510 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-client-ca" (OuterVolumeSpecName: "client-ca") pod "20592ddc-38f3-4407-a5c3-b719b89cd42e" (UID: "20592ddc-38f3-4407-a5c3-b719b89cd42e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.316053 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-config" (OuterVolumeSpecName: "config") pod "20592ddc-38f3-4407-a5c3-b719b89cd42e" (UID: "20592ddc-38f3-4407-a5c3-b719b89cd42e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.328365 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52700: no serving certificate available for the kubelet" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.330471 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20592ddc-38f3-4407-a5c3-b719b89cd42e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20592ddc-38f3-4407-a5c3-b719b89cd42e" (UID: "20592ddc-38f3-4407-a5c3-b719b89cd42e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.332813 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20592ddc-38f3-4407-a5c3-b719b89cd42e-kube-api-access-bbk9r" (OuterVolumeSpecName: "kube-api-access-bbk9r") pod "20592ddc-38f3-4407-a5c3-b719b89cd42e" (UID: "20592ddc-38f3-4407-a5c3-b719b89cd42e"). InnerVolumeSpecName "kube-api-access-bbk9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.402450 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.402624 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbk9r\" (UniqueName: \"kubernetes.io/projected/20592ddc-38f3-4407-a5c3-b719b89cd42e-kube-api-access-bbk9r\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.402643 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.402653 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20592ddc-38f3-4407-a5c3-b719b89cd42e-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.402661 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20592ddc-38f3-4407-a5c3-b719b89cd42e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.402880 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:16.902870286 +0000 UTC m=+221.377062846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.505107 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.505837 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.005821225 +0000 UTC m=+221.480013785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.610121 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.611506 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.111490857 +0000 UTC m=+221.585683417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.645023 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711514 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711585 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-serving-cert\") pod \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711625 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-config\") pod \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711665 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-proxy-ca-bundles\") pod \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.711700 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.211680861 +0000 UTC m=+221.685873421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711718 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-client-ca\") pod \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711750 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8skx2\" (UniqueName: \"kubernetes.io/projected/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-kube-api-access-8skx2\") pod \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\" (UID: \"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8\") " Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.711945 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.712320 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.212310241 +0000 UTC m=+221.686502801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.713004 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-config" (OuterVolumeSpecName: "config") pod "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" (UID: "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.713549 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" (UID: "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.713645 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" (UID: "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.728347 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" (UID: "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.736630 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-kube-api-access-8skx2" (OuterVolumeSpecName: "kube-api-access-8skx2") pod "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" (UID: "bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8"). InnerVolumeSpecName "kube-api-access-8skx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.740456 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:16 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:16 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:16 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.740509 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.813316 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.813755 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.313730572 +0000 UTC m=+221.787923132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.814074 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.814243 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.814267 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8skx2\" (UniqueName: \"kubernetes.io/projected/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-kube-api-access-8skx2\") on 
node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.814287 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.814298 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.814310 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.814557 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.314547028 +0000 UTC m=+221.788739588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.915095 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.915754 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.415739142 +0000 UTC m=+221.889931702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.943221 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7"] Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.943423 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20592ddc-38f3-4407-a5c3-b719b89cd42e" containerName="route-controller-manager" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.943437 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="20592ddc-38f3-4407-a5c3-b719b89cd42e" containerName="route-controller-manager" Feb 24 10:12:16 crc kubenswrapper[4985]: E0224 10:12:16.943463 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" containerName="controller-manager" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.943469 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" containerName="controller-manager" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.943559 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" containerName="controller-manager" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.943567 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="20592ddc-38f3-4407-a5c3-b719b89cd42e" containerName="route-controller-manager" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.943929 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.944920 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"] Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.945432 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.961429 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7"] Feb 24 10:12:16 crc kubenswrapper[4985]: I0224 10:12:16.980165 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"] Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019494 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7825dac-b973-4913-bf15-eb8fc3fe6236-serving-cert\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019549 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-client-ca\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019571 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-client-ca\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019598 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-config\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019614 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-serving-cert\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019629 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-config\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019655 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019681 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk6xc\" (UniqueName: \"kubernetes.io/projected/b7825dac-b973-4913-bf15-eb8fc3fe6236-kube-api-access-jk6xc\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019698 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-proxy-ca-bundles\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.019725 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn87s\" (UniqueName: \"kubernetes.io/projected/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-kube-api-access-rn87s\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.021067 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.521051453 +0000 UTC m=+221.995244013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.089130 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zjjx8" event={"ID":"bce7d3e8-b855-43ae-8527-fbc14ac50521","Type":"ContainerStarted","Data":"128bdd333dd7c75a3cc1390eeab777ea83b06046f2f1d7409b84796112f48242"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.091190 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zjjx8" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.114458 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.114521 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.126264 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" event={"ID":"deefacd4-ae02-434c-bb74-2ac6b217e0e2","Type":"ContainerStarted","Data":"1ab45cd22c08bf015a49aa884afc642ca22eceb5e0ac6f3ace5f3ee945f3265c"} Feb 
24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.126310 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" event={"ID":"deefacd4-ae02-434c-bb74-2ac6b217e0e2","Type":"ContainerStarted","Data":"99e7f3fec3858a8ef9fa686d9bfe2474ca4fc30ff924fd86d1718f126ce72c0b"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127359 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127641 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7825dac-b973-4913-bf15-eb8fc3fe6236-serving-cert\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127676 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-client-ca\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127694 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-client-ca\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 
crc kubenswrapper[4985]: I0224 10:12:17.127719 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-config\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127743 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-serving-cert\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127758 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-config\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127790 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-proxy-ca-bundles\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127808 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk6xc\" (UniqueName: \"kubernetes.io/projected/b7825dac-b973-4913-bf15-eb8fc3fe6236-kube-api-access-jk6xc\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: 
\"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.127844 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn87s\" (UniqueName: \"kubernetes.io/projected/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-kube-api-access-rn87s\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.128110 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.628097248 +0000 UTC m=+222.102289808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.130433 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-client-ca\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.131026 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-client-ca\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.131974 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-config\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.133476 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-config\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.140486 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-proxy-ca-bundles\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.146224 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" event={"ID":"da44f74b-f631-45cb-a8ad-78e36e49d2f1","Type":"ContainerStarted","Data":"405a29f87a393b424968a8e316a740837bb601aeea49ed2aeda71367b8ac4e18"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.146276 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" event={"ID":"da44f74b-f631-45cb-a8ad-78e36e49d2f1","Type":"ContainerStarted","Data":"46898052ed85d515bcc88fe2ac89a245c022eef3d99d70d68e73821e94a014b1"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.152739 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-serving-cert\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.160119 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7825dac-b973-4913-bf15-eb8fc3fe6236-serving-cert\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.160322 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn87s\" (UniqueName: \"kubernetes.io/projected/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-kube-api-access-rn87s\") pod \"route-controller-manager-ddbfd44ff-n5gs7\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.162022 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" event={"ID":"122aaa41-07ff-451d-a71f-b343d6993525","Type":"ContainerStarted","Data":"8025662ed5581623a363ffe626b26757f56a8eb3120d7dd766d3c967ac796131"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.162059 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" event={"ID":"122aaa41-07ff-451d-a71f-b343d6993525","Type":"ContainerStarted","Data":"8c2cbfa87a80036e37536951d4e753088e3b7cbd7607333efbff9f44988a42e3"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.170911 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk6xc\" (UniqueName: \"kubernetes.io/projected/b7825dac-b973-4913-bf15-eb8fc3fe6236-kube-api-access-jk6xc\") pod \"controller-manager-7fd5bb98f6-xk6s7\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") " pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.180729 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zjjx8" podStartSLOduration=175.180707336 podStartE2EDuration="2m55.180707336s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.115214194 +0000 UTC m=+221.589406754" watchObservedRunningTime="2026-02-24 10:12:17.180707336 +0000 UTC m=+221.654899896" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.187803 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" event={"ID":"c2ddf07b-fdd4-4d99-a09c-256ed526960f","Type":"ContainerStarted","Data":"0d1b9fe062a3cea815fadd3667ac999e37f2130a76ff41879cc0734e3e506ccc"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.208229 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" event={"ID":"ad42793f-f18e-4948-95fe-84c7b2edcf9e","Type":"ContainerStarted","Data":"bb0d4d988404ed95a3d3b0c0209bd0d7e87aed8a68d339b1f0573cd77961ca30"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.208283 4985 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" event={"ID":"ad42793f-f18e-4948-95fe-84c7b2edcf9e","Type":"ContainerStarted","Data":"7b545846c76737c93132670da9c70429042a44f9b2e3b9c8c9de50af34ef9344"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.211729 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lwqzt" podStartSLOduration=175.211708125 podStartE2EDuration="2m55.211708125s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.182870943 +0000 UTC m=+221.657063503" watchObservedRunningTime="2026-02-24 10:12:17.211708125 +0000 UTC m=+221.685900685" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.235601 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.235886 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.735871254 +0000 UTC m=+222.210063804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.238086 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pswjd" podStartSLOduration=175.238066371 podStartE2EDuration="2m55.238066371s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.209233739 +0000 UTC m=+221.683426299" watchObservedRunningTime="2026-02-24 10:12:17.238066371 +0000 UTC m=+221.712258931" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.238200 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" event={"ID":"2dc233e3-9d26-4ec2-957f-350400af638e","Type":"ContainerStarted","Data":"323ac8ff86ef7da3ddca2ad121aab19c90eb1d2700d5209f0a95736556dc51cc"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.243445 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" event={"ID":"20592ddc-38f3-4407-a5c3-b719b89cd42e","Type":"ContainerDied","Data":"ff6c482d44e81ef5551d2c16588d60c05c819ec59e8e3e405a92db2b3b95cc78"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.244627 4985 scope.go:117] "RemoveContainer" containerID="e79df61ee8416172c627e78c1fe537792486444ac107125c0b8cdf78710fecf7" Feb 24 10:12:17 crc kubenswrapper[4985]: 
I0224 10:12:17.244786 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.240893 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hthm" podStartSLOduration=175.240887297 podStartE2EDuration="2m55.240887297s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.238727201 +0000 UTC m=+221.712919761" watchObservedRunningTime="2026-02-24 10:12:17.240887297 +0000 UTC m=+221.715079857" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.270039 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.284276 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" event={"ID":"b6a6001a-7b27-4dc4-a1c5-5bbaa449dfe0","Type":"ContainerStarted","Data":"2bbf5dd735662504a85ee0f16a01c6aae6aa287c1ed176d900d5448434954db9"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.284938 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.286802 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" event={"ID":"9debae0d-c342-4558-8796-bc9798c297de","Type":"ContainerStarted","Data":"1432b5da7713afdd51395ebc8bccb9f524584e0f3a57ec9c4b70d196c54cf7e6"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.286833 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" event={"ID":"9debae0d-c342-4558-8796-bc9798c297de","Type":"ContainerStarted","Data":"98970b08825d9328dd3a93275b3b1a67b9bd2ca7c7d76a3a24c97fbe0062ed7b"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.288194 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" event={"ID":"6ef771ef-28ac-46c6-925e-f12a7a70b6c3","Type":"ContainerStarted","Data":"c094ee19a2d18359139a745230e588b4a99d5fd2e567eff67655a43342efae0f"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.288722 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.289741 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" event={"ID":"db31f9f1-19fc-47b3-8570-5428789afce0","Type":"ContainerStarted","Data":"5af237f8451ba70edba7dceb7b40db6fc81dff13aff7f2040d3e72e32af32b4b"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.289760 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" event={"ID":"db31f9f1-19fc-47b3-8570-5428789afce0","Type":"ContainerStarted","Data":"81fe2d35739a3a9405e8982a78b064b8acc6fb0403ab4e3519229693176fe216"} Feb 24 
10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.291320 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w5s2c" event={"ID":"492804d6-fccf-4710-b64f-aef85fa80b2d","Type":"ContainerStarted","Data":"c59ec8ee69d28932ad0836a9aa0d732cf21c4f3e12e03c13e011d0d500d11419"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.291693 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.292528 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" event={"ID":"45e61914-7807-43c3-b80b-a5b355b7447e","Type":"ContainerStarted","Data":"43b3a59bcb40c5e7e589d3a5ba0cf3051a920b1010497d91dbd37084ab9911f3"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.293153 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.293984 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" event={"ID":"b85b9142-2d6e-47e9-8d88-bed8bbaa6df7","Type":"ContainerStarted","Data":"dfdc06347248145365837bc10fa53cb2f75acb4a166ec9718c6d132c68cb804e"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.300167 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-prkmg" podStartSLOduration=175.30014913 podStartE2EDuration="2m55.30014913s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.268004937 +0000 UTC m=+221.742197497" watchObservedRunningTime="2026-02-24 10:12:17.30014913 +0000 UTC m=+221.774341700" Feb 24 
10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.301556 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" event={"ID":"34aea390-2b22-43e4-aabd-8c3d13390620","Type":"ContainerStarted","Data":"7b50a7d9639092eea4ba91c2cdb9e95478d4532ba25358788f5e3ea4fba9da1f"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.301592 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" event={"ID":"34aea390-2b22-43e4-aabd-8c3d13390620","Type":"ContainerStarted","Data":"9c55b237ac6142f4dd605d60fe12addac9ab5f40327bb943700ab52aa4d128c5"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.301601 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" event={"ID":"34aea390-2b22-43e4-aabd-8c3d13390620","Type":"ContainerStarted","Data":"1ddea4a4c07a68161e702386edf83b29ffe0e919ad897b417d0f6e0c9b10a950"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.301674 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hv9t" podStartSLOduration=175.301666616 podStartE2EDuration="2m55.301666616s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.299352115 +0000 UTC m=+221.773544695" watchObservedRunningTime="2026-02-24 10:12:17.301666616 +0000 UTC m=+221.775859176" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.311371 4985 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rlq9j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection 
refused" start-of-body= Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.311426 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.320257 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rmswq" event={"ID":"e3da134f-45d4-4d12-bc38-87546e0920cd","Type":"ContainerStarted","Data":"9207861b54c84fc5e0939c65b11393f29f0d07f9e0007d756c1ea5b4e43cdc99"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.320787 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-txrfz" podStartSLOduration=175.32076634 podStartE2EDuration="2m55.32076634s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.318021066 +0000 UTC m=+221.792213636" watchObservedRunningTime="2026-02-24 10:12:17.32076634 +0000 UTC m=+221.794958900" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.326349 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" event={"ID":"2364f27b-fc75-4e56-8122-d6bdcc763b0a","Type":"ContainerStarted","Data":"62739b6d86536f8931eb7e7b691c2a9f9d59ebcfe45b7f34cb1fb931de68a641"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.327618 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.330871 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" event={"ID":"8905bada-0871-41b5-9a21-83e4d3884edf","Type":"ContainerStarted","Data":"17b3fe39e613464e95d0ccd3cd4cee865ff32ab6a246b0f434f9c0e8a2aab459"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.336069 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.337500 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.837485681 +0000 UTC m=+222.311678241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.337877 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" event={"ID":"1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87","Type":"ContainerStarted","Data":"ba288c601522a34ef7124352855b89724afbbb7edeca8c7062f4b8a9c60957da"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.340462 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 
24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.359084 4985 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mvvxj container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.359133 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" podUID="1ef1d68d-0ae5-4c63-b7b3-40e39ad30a87" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.377939 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2vcf4" podStartSLOduration=175.377922078 podStartE2EDuration="2m55.377922078s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.375947668 +0000 UTC m=+221.850140238" watchObservedRunningTime="2026-02-24 10:12:17.377922078 +0000 UTC m=+221.852114638" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.395423 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" event={"ID":"4389db90-c320-4227-b4cc-efc14354ac37","Type":"ContainerStarted","Data":"e35e70dfc0bd537cf1668450a64d040bfcfe7cdccfd8beb1725df68f4e0e16ee"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.395462 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" 
event={"ID":"4389db90-c320-4227-b4cc-efc14354ac37","Type":"ContainerStarted","Data":"c2022454a2c73cef2a4cc6b2c00ed277e084fefa1c7d3194a68f26c6f398b6e6"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.395472 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" event={"ID":"4389db90-c320-4227-b4cc-efc14354ac37","Type":"ContainerStarted","Data":"51599a3ddfd3c5627c4b33dfe8c5313767cb915b9b7651b5804cbd8567a5b7dd"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.396020 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.411138 4985 generic.go:334] "Generic (PLEG): container finished" podID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" containerID="48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df" exitCode=0 Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.411233 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" event={"ID":"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8","Type":"ContainerDied","Data":"48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.411262 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" event={"ID":"bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8","Type":"ContainerDied","Data":"ebf79fb408673c2d0093fd42f3fea35acedeef7df5b0c6bea113c386afda222d"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.411279 4985 scope.go:117] "RemoveContainer" containerID="48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.411391 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nncfz" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.437057 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6ssps" event={"ID":"ea70e899-59ad-4ef2-b407-81c67de39e50","Type":"ContainerStarted","Data":"6ec94289e0a76002f2659846ad095f19b4a76ed006e514b23085cda377a21395"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.438389 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.442115 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:17.942101721 +0000 UTC m=+222.416294281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.468811 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" event={"ID":"751d5065-e3f3-4fe5-9d24-ca6c7197d0d6","Type":"ContainerStarted","Data":"522f33baf0623980ce283bcfb60042867c3c02498199745896a6fd24a3263df1"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.489258 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" event={"ID":"139ca41b-7c70-4db2-a9bf-a5495fdeadbe","Type":"ContainerStarted","Data":"b8427f16c7d0eac2f3ca420fe3538163ec0e1a92d5896a4eb3d2b6c40292f504"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.503086 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" event={"ID":"4d3661f1-4dcc-48a4-9129-7052c5a2d098","Type":"ContainerStarted","Data":"39bbf9e1fee273add8cbedd132821f737ba27875f08afa81b9534115261e14dd"} Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.503159 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.503882 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" podStartSLOduration=175.50386513 podStartE2EDuration="2m55.50386513s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.448307782 +0000 UTC m=+221.922500342" watchObservedRunningTime="2026-02-24 10:12:17.50386513 +0000 UTC m=+221.978057690" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.517712 4985 scope.go:117] "RemoveContainer" containerID="48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.518156 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df\": container with ID starting with 48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df not found: ID does not exist" containerID="48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.518184 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df"} err="failed to get container status \"48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df\": rpc error: code = NotFound desc = could not find container \"48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df\": container with ID starting with 48538cdf04104ccaf5c39bd7234a49eddf1558e724666b8500239983bef959df not found: ID does not exist" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.522229 4985 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cfp92 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.522272 4985 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" podUID="4d3661f1-4dcc-48a4-9129-7052c5a2d098" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.545750 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mlncn" podStartSLOduration=175.545734801 podStartE2EDuration="2m55.545734801s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.517521078 +0000 UTC m=+221.991713638" watchObservedRunningTime="2026-02-24 10:12:17.545734801 +0000 UTC m=+222.019927361" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.549291 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.562372 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.062344669 +0000 UTC m=+222.536537249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.634166 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xfms" podStartSLOduration=175.634149825 podStartE2EDuration="2m55.634149825s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.54830401 +0000 UTC m=+222.022496570" watchObservedRunningTime="2026-02-24 10:12:17.634149825 +0000 UTC m=+222.108342385" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.654643 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.656477 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.156462128 +0000 UTC m=+222.630654688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.748123 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:17 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:17 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:17 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.748182 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.758672 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ww7q7" podStartSLOduration=175.758653463 podStartE2EDuration="2m55.758653463s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.634517556 +0000 UTC m=+222.108710126" watchObservedRunningTime="2026-02-24 10:12:17.758653463 +0000 UTC m=+222.232846023" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.759997 4985 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.760315 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.260301223 +0000 UTC m=+222.734493783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.762496 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w5s2c" podStartSLOduration=7.762485741 podStartE2EDuration="7.762485741s" podCreationTimestamp="2026-02-24 10:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.755468166 +0000 UTC m=+222.229660726" watchObservedRunningTime="2026-02-24 10:12:17.762485741 +0000 UTC m=+222.236678301" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.844029 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"] Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.846586 4985 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v"] Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.859302 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t527v"] Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.859935 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-25d4p" podStartSLOduration=175.85991705 podStartE2EDuration="2m55.85991705s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.850440601 +0000 UTC m=+222.324633161" watchObservedRunningTime="2026-02-24 10:12:17.85991705 +0000 UTC m=+222.334109610" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.863555 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.863885 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.363873491 +0000 UTC m=+222.838066051 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.882981 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2hxbt" podStartSLOduration=175.882963506 podStartE2EDuration="2m55.882963506s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.882759539 +0000 UTC m=+222.356952099" watchObservedRunningTime="2026-02-24 10:12:17.882963506 +0000 UTC m=+222.357156066" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.924789 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6ssps" podStartSLOduration=7.924771144 podStartE2EDuration="7.924771144s" podCreationTimestamp="2026-02-24 10:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.923356831 +0000 UTC m=+222.397549391" watchObservedRunningTime="2026-02-24 10:12:17.924771144 +0000 UTC m=+222.398963704" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.968587 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:17 crc kubenswrapper[4985]: E0224 10:12:17.969386 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.469368809 +0000 UTC m=+222.943561369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.970500 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-t8w87" podStartSLOduration=175.970479782 podStartE2EDuration="2m55.970479782s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.968269275 +0000 UTC m=+222.442461835" watchObservedRunningTime="2026-02-24 10:12:17.970479782 +0000 UTC m=+222.444672342" Feb 24 10:12:17 crc kubenswrapper[4985]: I0224 10:12:17.994751 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2" podStartSLOduration=175.994735454 podStartE2EDuration="2m55.994735454s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:17.994043643 +0000 UTC 
m=+222.468236203" watchObservedRunningTime="2026-02-24 10:12:17.994735454 +0000 UTC m=+222.468928014" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.020404 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-225jm" podStartSLOduration=176.020380018 podStartE2EDuration="2m56.020380018s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.019279435 +0000 UTC m=+222.493471995" watchObservedRunningTime="2026-02-24 10:12:18.020380018 +0000 UTC m=+222.494572578" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.050331 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nncfz"] Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.068512 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nncfz"] Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.071175 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.071848 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.571826142 +0000 UTC m=+223.046018702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.076568 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" podStartSLOduration=177.076550206 podStartE2EDuration="2m57.076550206s" podCreationTimestamp="2026-02-24 10:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.07600522 +0000 UTC m=+222.550197790" watchObservedRunningTime="2026-02-24 10:12:18.076550206 +0000 UTC m=+222.550742766" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.105314 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" podStartSLOduration=176.105288976 podStartE2EDuration="2m56.105288976s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.096669702 +0000 UTC m=+222.570862282" watchObservedRunningTime="2026-02-24 10:12:18.105288976 +0000 UTC m=+222.579481536" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.138540 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" podStartSLOduration=176.138514061 podStartE2EDuration="2m56.138514061s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.137953995 +0000 UTC m=+222.612146555" watchObservedRunningTime="2026-02-24 10:12:18.138514061 +0000 UTC m=+222.612706621" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.173033 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.173944 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.673918575 +0000 UTC m=+223.148111135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.213388 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rmswq" podStartSLOduration=8.213371881 podStartE2EDuration="8.213371881s" podCreationTimestamp="2026-02-24 10:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.179839656 +0000 UTC m=+222.654032216" watchObservedRunningTime="2026-02-24 10:12:18.213371881 +0000 UTC m=+222.687564441" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.274666 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.274936 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" podStartSLOduration=176.274892803 podStartE2EDuration="2m56.274892803s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.247717041 +0000 UTC m=+222.721909611" 
watchObservedRunningTime="2026-02-24 10:12:18.274892803 +0000 UTC m=+222.749085363" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.274997 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.774984616 +0000 UTC m=+223.249177176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.277691 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20592ddc-38f3-4407-a5c3-b719b89cd42e" path="/var/lib/kubelet/pods/20592ddc-38f3-4407-a5c3-b719b89cd42e/volumes" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.293997 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8" path="/var/lib/kubelet/pods/bbf9fdf8-7dd4-4bb2-9ff7-0349600747f8/volumes" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.294826 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7"] Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.375361 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.375532 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.875505751 +0000 UTC m=+223.349698311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.375699 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.375979 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.875972275 +0000 UTC m=+223.350164835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.476454 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.476657 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.976631793 +0000 UTC m=+223.450824353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.477025 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.477330 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:18.977318784 +0000 UTC m=+223.451511344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.506038 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w5s2c" event={"ID":"492804d6-fccf-4710-b64f-aef85fa80b2d","Type":"ContainerStarted","Data":"9d7497624888f2fd5d8ff379aa5ea69c5890d9a5d1af9ec371f210b82066948a"} Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.510461 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" event={"ID":"b7825dac-b973-4913-bf15-eb8fc3fe6236","Type":"ContainerStarted","Data":"4d31b00cfb765cd97b59b6002153ed8950bbc7226dbf7476eacb407215e52cb5"} Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.510521 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" event={"ID":"b7825dac-b973-4913-bf15-eb8fc3fe6236","Type":"ContainerStarted","Data":"ce3b9082ca39488e29ea0b35955742a6353c76102255c8ba68e8ca36f8ff49fe"} Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.510801 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.512267 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" event={"ID":"7b4ae5a0-ee86-4518-b754-4b57da2dc152","Type":"ContainerStarted","Data":"806085f8db590ec6c9059ee0a237984dd0a60e9b2ee3ee82c53e19576e5cce60"} Feb 24 10:12:18 crc 
kubenswrapper[4985]: I0224 10:12:18.514167 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" event={"ID":"86bcd109-4d8a-427f-9c07-1e88c6e1ea22","Type":"ContainerStarted","Data":"a67942c5969e081d5a0c4e407678a8ccecd962b6e84b232616bd30eaed529c2d"} Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.514729 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.514844 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.518265 4985 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rlq9j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.518302 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.518352 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" Feb 24 
10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.534813 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" podStartSLOduration=3.534797442 podStartE2EDuration="3.534797442s" podCreationTimestamp="2026-02-24 10:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:18.533935876 +0000 UTC m=+223.008128436" watchObservedRunningTime="2026-02-24 10:12:18.534797442 +0000 UTC m=+223.008990002" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.563881 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mvvxj" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.578447 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.578635 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.078610773 +0000 UTC m=+223.552803333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.579487 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.580403 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.080391197 +0000 UTC m=+223.554583757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.682666 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.682821 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.182804839 +0000 UTC m=+223.656997399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.682856 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.683161 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.183150839 +0000 UTC m=+223.657343399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.737490 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:18 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:18 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:18 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.737550 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.784010 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.784309 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 10:12:19.284282753 +0000 UTC m=+223.758475313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.885712 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.886333 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.386316944 +0000 UTC m=+223.860509504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.978076 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52714: no serving certificate available for the kubelet" Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.987399 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.987574 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.48754688 +0000 UTC m=+223.961739440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:18 crc kubenswrapper[4985]: I0224 10:12:18.987751 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:18 crc kubenswrapper[4985]: E0224 10:12:18.988145 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.488136338 +0000 UTC m=+223.962328888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.038704 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfp92" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.089187 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.589170448 +0000 UTC m=+224.063363008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.089208 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.089536 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.089904 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.58988035 +0000 UTC m=+224.064072910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.188096 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lpjfw"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.188962 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.190841 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.191288 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.691272581 +0000 UTC m=+224.165465141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.192758 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.222261 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lpjfw"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.292808 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.292856 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-catalog-content\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.292876 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-utilities\") pod \"certified-operators-lpjfw\" (UID: 
\"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.292973 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp2f6\" (UniqueName: \"kubernetes.io/projected/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-kube-api-access-mp2f6\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.293333 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.793318792 +0000 UTC m=+224.267511352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.388107 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l5p66"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.389077 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.391118 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.393850 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.394113 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp2f6\" (UniqueName: \"kubernetes.io/projected/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-kube-api-access-mp2f6\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.394203 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-catalog-content\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.394224 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-utilities\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.394840 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-utilities\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.394929 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:19.89491416 +0000 UTC m=+224.369106720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.395297 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-catalog-content\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.418723 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp2f6\" (UniqueName: \"kubernetes.io/projected/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-kube-api-access-mp2f6\") pod \"certified-operators-lpjfw\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.490554 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-l5p66"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.495483 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-utilities\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.495548 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwz6\" (UniqueName: \"kubernetes.io/projected/5186b86d-7d8f-4ed5-b444-991eaf2a793e-kube-api-access-jrwz6\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.495594 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-catalog-content\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.495633 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.495925 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-02-24 10:12:19.995913889 +0000 UTC m=+224.470106449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.507018 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.525739 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" event={"ID":"7b4ae5a0-ee86-4518-b754-4b57da2dc152","Type":"ContainerStarted","Data":"85c5348917f617a7905c124e88a44caaba2e4ace3a3ed07213df0e486206e379"} Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.525782 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" event={"ID":"7b4ae5a0-ee86-4518-b754-4b57da2dc152","Type":"ContainerStarted","Data":"db0c29478eb49dc2d7a14fff9ff6c4be21a0707762a09e57f512deba20d5bbd7"} Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.536008 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" event={"ID":"86bcd109-4d8a-427f-9c07-1e88c6e1ea22","Type":"ContainerStarted","Data":"1a04a9fa1abe62c3dd7a539a0b7bea961f22b45e056bbfd47c27122efcb4e73b"} Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.536588 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:19 crc 
kubenswrapper[4985]: I0224 10:12:19.536801 4985 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rlq9j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.536846 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.537157 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.537202 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.544223 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.567392 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" podStartSLOduration=4.567377824 podStartE2EDuration="4.567377824s" podCreationTimestamp="2026-02-24 10:12:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:19.562150434 +0000 UTC m=+224.036342994" watchObservedRunningTime="2026-02-24 10:12:19.567377824 +0000 UTC m=+224.041570384" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.578131 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8k2ln"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.578994 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.596064 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8k2ln"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.596498 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.596651 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.096628959 +0000 UTC m=+224.570821519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.596699 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.596828 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-utilities\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.596974 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwz6\" (UniqueName: \"kubernetes.io/projected/5186b86d-7d8f-4ed5-b444-991eaf2a793e-kube-api-access-jrwz6\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.597169 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-24 10:12:20.097151615 +0000 UTC m=+224.571344175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.597252 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-catalog-content\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.597866 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-utilities\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.598626 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-catalog-content\") pod \"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.654062 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwz6\" (UniqueName: \"kubernetes.io/projected/5186b86d-7d8f-4ed5-b444-991eaf2a793e-kube-api-access-jrwz6\") pod 
\"community-operators-l5p66\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.698255 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.698450 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-utilities\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.698537 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-catalog-content\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.698582 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tf4\" (UniqueName: \"kubernetes.io/projected/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-kube-api-access-49tf4\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.698666 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.19865253 +0000 UTC m=+224.672845090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.711557 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.739631 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:19 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:19 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:19 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.739678 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.779757 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-msgzd"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.780861 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.794057 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-msgzd"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.799556 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-catalog-content\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.799624 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.799652 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49tf4\" (UniqueName: \"kubernetes.io/projected/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-kube-api-access-49tf4\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.799695 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-utilities\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.799997 4985 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.299981719 +0000 UTC m=+224.774174279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.800604 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-catalog-content\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.800690 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-utilities\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.814259 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lpjfw"] Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.814695 4985 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.821962 4985 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-49tf4\" (UniqueName: \"kubernetes.io/projected/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-kube-api-access-49tf4\") pod \"certified-operators-8k2ln\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.897291 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.901385 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.901587 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.401558806 +0000 UTC m=+224.875751366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.901682 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-utilities\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.901879 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqx4\" (UniqueName: \"kubernetes.io/projected/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-kube-api-access-4rqx4\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.901941 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-catalog-content\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.901999 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:19 crc kubenswrapper[4985]: E0224 10:12:19.902430 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.402419092 +0000 UTC m=+224.876611652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:19 crc kubenswrapper[4985]: I0224 10:12:19.948670 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5p66"] Feb 24 10:12:19 crc kubenswrapper[4985]: W0224 10:12:19.972395 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5186b86d_7d8f_4ed5_b444_991eaf2a793e.slice/crio-c407de72161d712288ff2dbbd966ca1b6238e0b9bdc50c5086e8a7d35cd8bf94 WatchSource:0}: Error finding container c407de72161d712288ff2dbbd966ca1b6238e0b9bdc50c5086e8a7d35cd8bf94: Status 404 returned error can't find the container with id c407de72161d712288ff2dbbd966ca1b6238e0b9bdc50c5086e8a7d35cd8bf94 Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.003465 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.003682 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqx4\" (UniqueName: \"kubernetes.io/projected/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-kube-api-access-4rqx4\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.003709 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-catalog-content\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.003932 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-utilities\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.004430 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-utilities\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: E0224 10:12:20.004540 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 10:12:20.504506924 +0000 UTC m=+224.978699484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.006194 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-catalog-content\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.023665 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqx4\" (UniqueName: \"kubernetes.io/projected/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-kube-api-access-4rqx4\") pod \"community-operators-msgzd\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.105247 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:20 crc kubenswrapper[4985]: E0224 10:12:20.105536 4985 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.605523654 +0000 UTC m=+225.079716214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.105737 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.110764 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8k2ln"] Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.208529 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:20 crc kubenswrapper[4985]: E0224 10:12:20.208814 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.708798603 +0000 UTC m=+225.182991163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.312039 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:20 crc kubenswrapper[4985]: E0224 10:12:20.312485 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.812473413 +0000 UTC m=+225.286665973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.413514 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:20 crc kubenswrapper[4985]: E0224 10:12:20.413608 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.913589216 +0000 UTC m=+225.387781776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.414034 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:20 crc kubenswrapper[4985]: E0224 10:12:20.414315 4985 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:12:20.914307238 +0000 UTC m=+225.388499798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggvx6" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.479772 4985 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-24T10:12:19.81472356Z","Handler":null,"Name":""} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.483229 4985 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.483289 4985 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.504583 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-msgzd"] Feb 24 10:12:20 crc kubenswrapper[4985]: W0224 10:12:20.513072 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720477c9_8e44_43cf_a9ba_5ac5b96fe65b.slice/crio-a1ce1f4c5e55dfee91352b4d210f10ae503899dc99eeeb8950de305d48c8f541 WatchSource:0}: Error finding container a1ce1f4c5e55dfee91352b4d210f10ae503899dc99eeeb8950de305d48c8f541: Status 404 returned error can't find the container with id a1ce1f4c5e55dfee91352b4d210f10ae503899dc99eeeb8950de305d48c8f541 Feb 24 10:12:20 crc kubenswrapper[4985]: 
I0224 10:12:20.515059 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.519947 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.540018 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" event={"ID":"7b4ae5a0-ee86-4518-b754-4b57da2dc152","Type":"ContainerStarted","Data":"cce14b8179c5f9ac8b16fa6e177381c93eab024b857e2afb6f8c41b79950587f"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.541987 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msgzd" event={"ID":"720477c9-8e44-43cf-a9ba-5ac5b96fe65b","Type":"ContainerStarted","Data":"a1ce1f4c5e55dfee91352b4d210f10ae503899dc99eeeb8950de305d48c8f541"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.543412 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2ln" event={"ID":"49f9b37d-f90b-49ba-bb8d-bb34255c63c0","Type":"ContainerDied","Data":"08ca263dc71769f8f24ce540adc52274a5a7dc92e3be740cd8fd1080cab15145"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.543401 4985 generic.go:334] "Generic (PLEG): container finished" podID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" 
containerID="08ca263dc71769f8f24ce540adc52274a5a7dc92e3be740cd8fd1080cab15145" exitCode=0 Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.543640 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2ln" event={"ID":"49f9b37d-f90b-49ba-bb8d-bb34255c63c0","Type":"ContainerStarted","Data":"2d4b218c86ae22c435eab27cb0afca6521b0fd7b35d2817159e919ecf4296500"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.545962 4985 generic.go:334] "Generic (PLEG): container finished" podID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerID="f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570" exitCode=0 Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.546007 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpjfw" event={"ID":"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf","Type":"ContainerDied","Data":"f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.546022 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpjfw" event={"ID":"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf","Type":"ContainerStarted","Data":"e2e9add4855b0e37590ea0f1d33c498f6b79987a891a5f60c7afa3e8c22742fc"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.547071 4985 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.551535 4985 generic.go:334] "Generic (PLEG): container finished" podID="2364f27b-fc75-4e56-8122-d6bdcc763b0a" containerID="62739b6d86536f8931eb7e7b691c2a9f9d59ebcfe45b7f34cb1fb931de68a641" exitCode=0 Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.551601 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" 
event={"ID":"2364f27b-fc75-4e56-8122-d6bdcc763b0a","Type":"ContainerDied","Data":"62739b6d86536f8931eb7e7b691c2a9f9d59ebcfe45b7f34cb1fb931de68a641"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.553464 4985 generic.go:334] "Generic (PLEG): container finished" podID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerID="8afe751e1cca698fb0d943baa943410c58c42409ca1e1eaa81771f5e6aaaa4bd" exitCode=0 Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.553580 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerDied","Data":"8afe751e1cca698fb0d943baa943410c58c42409ca1e1eaa81771f5e6aaaa4bd"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.553634 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerStarted","Data":"c407de72161d712288ff2dbbd966ca1b6238e0b9bdc50c5086e8a7d35cd8bf94"} Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.568929 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n5mkh" podStartSLOduration=10.568910717 podStartE2EDuration="10.568910717s" podCreationTimestamp="2026-02-24 10:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:20.566968018 +0000 UTC m=+225.041160598" watchObservedRunningTime="2026-02-24 10:12:20.568910717 +0000 UTC m=+225.043103297" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.616084 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: 
\"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.627603 4985 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.627957 4985 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.629398 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.629809 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-d7v4s" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.633872 4985 patch_prober.go:28] interesting pod/console-f9d7485db-d7v4s container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.633936 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-d7v4s" podUID="8c5e6fc2-42c9-4794-91cc-1f74adf686db" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 24 10:12:20 crc 
kubenswrapper[4985]: I0224 10:12:20.657190 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggvx6\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.738017 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:20 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:20 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:20 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.738097 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.760122 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n8mff" Feb 24 10:12:20 crc kubenswrapper[4985]: I0224 10:12:20.912156 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.151185 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggvx6"] Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.175977 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ptllg"] Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.176922 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.181961 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.188202 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptllg"] Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.224499 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.224570 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.224646 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.225551 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.229855 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.230764 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.325820 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-utilities\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " 
pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.325916 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6dj\" (UniqueName: \"kubernetes.io/projected/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-kube-api-access-6s6dj\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.325958 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod \"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.325993 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-catalog-content\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.326018 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.329557 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4340d1a-60cb-4240-87ba-1e468c9c41cf-metrics-certs\") pod 
\"network-metrics-daemon-xkc65\" (UID: \"d4340d1a-60cb-4240-87ba-1e468c9c41cf\") " pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.329759 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.426561 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6dj\" (UniqueName: \"kubernetes.io/projected/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-kube-api-access-6s6dj\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.426627 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-catalog-content\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.426665 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-utilities\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.427068 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-utilities\") pod 
\"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.427651 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-catalog-content\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.427969 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xkc65" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.444020 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.444840 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6dj\" (UniqueName: \"kubernetes.io/projected/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-kube-api-access-6s6dj\") pod \"redhat-marketplace-ptllg\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.453792 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.462705 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.542151 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.559620 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.559656 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.574095 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.578737 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f9svt"] Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.579687 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.588848 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9svt"] Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.615227 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" event={"ID":"6d51aec2-2b30-49d0-8aea-019bea882940","Type":"ContainerStarted","Data":"ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09"} Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.615294 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" event={"ID":"6d51aec2-2b30-49d0-8aea-019bea882940","Type":"ContainerStarted","Data":"25d4eb64fe3c05fdae5b8092902a2b51b4d639f6b6465ab7c6bb6eb3a8ff8b59"} Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.615414 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.655926 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" podStartSLOduration=179.655912024 podStartE2EDuration="2m59.655912024s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:21.655356786 +0000 UTC m=+226.129549346" watchObservedRunningTime="2026-02-24 10:12:21.655912024 +0000 UTC m=+226.130104584" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.656974 4985 generic.go:334] "Generic (PLEG): container finished" podID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerID="e981e007a7ea0fce0be9468e1f9e9a588f85c1cfb6a3e4c227ef26fb6c6e1be8" exitCode=0 Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.657934 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msgzd" event={"ID":"720477c9-8e44-43cf-a9ba-5ac5b96fe65b","Type":"ContainerDied","Data":"e981e007a7ea0fce0be9468e1f9e9a588f85c1cfb6a3e4c227ef26fb6c6e1be8"} Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.664988 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nnr7f" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.730719 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qpdq\" (UniqueName: \"kubernetes.io/projected/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-kube-api-access-7qpdq\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.730943 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-utilities\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.732205 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-catalog-content\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.751612 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:21 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:21 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:21 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.751669 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.832909 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-catalog-content\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.833342 4985 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7qpdq\" (UniqueName: \"kubernetes.io/projected/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-kube-api-access-7qpdq\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.833412 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-utilities\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.833841 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-utilities\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.833880 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-catalog-content\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.888750 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qpdq\" (UniqueName: \"kubernetes.io/projected/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-kube-api-access-7qpdq\") pod \"redhat-marketplace-f9svt\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.902852 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:12:21 crc kubenswrapper[4985]: I0224 10:12:21.919353 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xkc65"] Feb 24 10:12:21 crc kubenswrapper[4985]: W0224 10:12:21.967598 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4340d1a_60cb_4240_87ba_1e468c9c41cf.slice/crio-58a1f37cd01692303c1d26b8d3167fad4f50c7b5a2be8aeaf9195724b4ae8298 WatchSource:0}: Error finding container 58a1f37cd01692303c1d26b8d3167fad4f50c7b5a2be8aeaf9195724b4ae8298: Status 404 returned error can't find the container with id 58a1f37cd01692303c1d26b8d3167fad4f50c7b5a2be8aeaf9195724b4ae8298 Feb 24 10:12:22 crc kubenswrapper[4985]: W0224 10:12:22.022419 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-704b170b7b5ef24d28a24fd6837fd1488e5621e73b4aa7ad57c1ccf004e6c393 WatchSource:0}: Error finding container 704b170b7b5ef24d28a24fd6837fd1488e5621e73b4aa7ad57c1ccf004e6c393: Status 404 returned error can't find the container with id 704b170b7b5ef24d28a24fd6837fd1488e5621e73b4aa7ad57c1ccf004e6c393 Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.036116 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptllg"] Feb 24 10:12:22 crc kubenswrapper[4985]: W0224 10:12:22.062371 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-cf54d706e063bb165c0de13d945b081d0acc22067c25f1f130034f85ed01db46 WatchSource:0}: Error finding container cf54d706e063bb165c0de13d945b081d0acc22067c25f1f130034f85ed01db46: Status 404 returned error can't find the container with id 
cf54d706e063bb165c0de13d945b081d0acc22067c25f1f130034f85ed01db46 Feb 24 10:12:22 crc kubenswrapper[4985]: W0224 10:12:22.121488 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f7b4e0_a896_4c14_bfea_fa6066bfd4c4.slice/crio-39001a38ca7d0ccc7dd63480281feff1af039b99ab2bca450a88ad395117d291 WatchSource:0}: Error finding container 39001a38ca7d0ccc7dd63480281feff1af039b99ab2bca450a88ad395117d291: Status 404 returned error can't find the container with id 39001a38ca7d0ccc7dd63480281feff1af039b99ab2bca450a88ad395117d291 Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.226681 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.282919 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.299733 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9svt"] Feb 24 10:12:22 crc kubenswrapper[4985]: W0224 10:12:22.321030 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd5749ab_5662_41d6_9d8b_663bdd9b7a0b.slice/crio-29db3a9b7b1869e00ae0d87b80d8ad197bbbdc4ede4cb6af006d8ebb1e569eb1 WatchSource:0}: Error finding container 29db3a9b7b1869e00ae0d87b80d8ad197bbbdc4ede4cb6af006d8ebb1e569eb1: Status 404 returned error can't find the container with id 29db3a9b7b1869e00ae0d87b80d8ad197bbbdc4ede4cb6af006d8ebb1e569eb1 Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.347283 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2364f27b-fc75-4e56-8122-d6bdcc763b0a-config-volume\") pod \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.347352 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2364f27b-fc75-4e56-8122-d6bdcc763b0a-secret-volume\") pod \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.347377 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmcsd\" (UniqueName: \"kubernetes.io/projected/2364f27b-fc75-4e56-8122-d6bdcc763b0a-kube-api-access-rmcsd\") pod \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\" (UID: \"2364f27b-fc75-4e56-8122-d6bdcc763b0a\") " Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.348689 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2364f27b-fc75-4e56-8122-d6bdcc763b0a-config-volume" (OuterVolumeSpecName: "config-volume") pod "2364f27b-fc75-4e56-8122-d6bdcc763b0a" (UID: "2364f27b-fc75-4e56-8122-d6bdcc763b0a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.354994 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2364f27b-fc75-4e56-8122-d6bdcc763b0a-kube-api-access-rmcsd" (OuterVolumeSpecName: "kube-api-access-rmcsd") pod "2364f27b-fc75-4e56-8122-d6bdcc763b0a" (UID: "2364f27b-fc75-4e56-8122-d6bdcc763b0a"). InnerVolumeSpecName "kube-api-access-rmcsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.355235 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2364f27b-fc75-4e56-8122-d6bdcc763b0a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2364f27b-fc75-4e56-8122-d6bdcc763b0a" (UID: "2364f27b-fc75-4e56-8122-d6bdcc763b0a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.381314 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rhlvp"] Feb 24 10:12:22 crc kubenswrapper[4985]: E0224 10:12:22.381849 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2364f27b-fc75-4e56-8122-d6bdcc763b0a" containerName="collect-profiles" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.382052 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2364f27b-fc75-4e56-8122-d6bdcc763b0a" containerName="collect-profiles" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.382611 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="2364f27b-fc75-4e56-8122-d6bdcc763b0a" containerName="collect-profiles" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.384305 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.387766 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.395423 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhlvp"] Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.448866 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-catalog-content\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.448991 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-utilities\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.449028 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2psfq\" (UniqueName: \"kubernetes.io/projected/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-kube-api-access-2psfq\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.449120 4985 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2364f27b-fc75-4e56-8122-d6bdcc763b0a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.449158 4985 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmcsd\" (UniqueName: \"kubernetes.io/projected/2364f27b-fc75-4e56-8122-d6bdcc763b0a-kube-api-access-rmcsd\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.449168 4985 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2364f27b-fc75-4e56-8122-d6bdcc763b0a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.551328 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2psfq\" (UniqueName: \"kubernetes.io/projected/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-kube-api-access-2psfq\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.551393 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-catalog-content\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.551496 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-utilities\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.552165 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-utilities\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " 
pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.552157 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-catalog-content\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.577065 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2psfq\" (UniqueName: \"kubernetes.io/projected/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-kube-api-access-2psfq\") pod \"redhat-operators-rhlvp\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.674718 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c7d1f2386d2d86d0ab95f40fdcc8110229e22a9677ccfaeb80c628ea8a9da9b1"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.674813 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"49f93ae7de36ce51bcf44aa2875b000fc353940c869dba281340f04e9c08b5fe"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.675176 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.677640 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xkc65" 
event={"ID":"d4340d1a-60cb-4240-87ba-1e468c9c41cf","Type":"ContainerStarted","Data":"58a1f37cd01692303c1d26b8d3167fad4f50c7b5a2be8aeaf9195724b4ae8298"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.683126 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2455c9b7bf82f386d49cff82d257bb26f4f74deb1bd9fa922450ba51a526d23c"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.683163 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cf54d706e063bb165c0de13d945b081d0acc22067c25f1f130034f85ed01db46"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.694820 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d69efad7b1f33876ebcc2ec1ec89c21a6dea3b69382519312f805f3b503264cd"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.694903 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"704b170b7b5ef24d28a24fd6837fd1488e5621e73b4aa7ad57c1ccf004e6c393"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.698591 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.704139 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" event={"ID":"2364f27b-fc75-4e56-8122-d6bdcc763b0a","Type":"ContainerDied","Data":"f60f137172f390d3b858db92e0e41ecf78bbfc35625bb45e4b106135474184df"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.704177 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60f137172f390d3b858db92e0e41ecf78bbfc35625bb45e4b106135474184df" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.704255 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532120-q5v4c" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.713428 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.713477 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.713927 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.713949 4985 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.725601 4985 generic.go:334] "Generic (PLEG): container finished" podID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerID="08043856d6e575b46ad8acdc69e420559f84d0f9606f18f6c8c1a6ba27ec67f4" exitCode=0 Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.725665 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9svt" event={"ID":"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b","Type":"ContainerDied","Data":"08043856d6e575b46ad8acdc69e420559f84d0f9606f18f6c8c1a6ba27ec67f4"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.725717 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9svt" event={"ID":"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b","Type":"ContainerStarted","Data":"29db3a9b7b1869e00ae0d87b80d8ad197bbbdc4ede4cb6af006d8ebb1e569eb1"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.735616 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gcwfl" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.736811 4985 generic.go:334] "Generic (PLEG): container finished" podID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerID="167acfba7cdf3cb566672b5c66f5af5dad7faf042c89cad0f57fafefc90d5cdd" exitCode=0 Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.737846 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerDied","Data":"167acfba7cdf3cb566672b5c66f5af5dad7faf042c89cad0f57fafefc90d5cdd"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.737867 
4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerStarted","Data":"39001a38ca7d0ccc7dd63480281feff1af039b99ab2bca450a88ad395117d291"} Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.739378 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:22 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:22 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:22 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.739399 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.808264 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v2cq6"] Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.809676 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.828328 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2cq6"] Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.861123 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wql8d\" (UniqueName: \"kubernetes.io/projected/a6f03b61-4740-4538-877d-40390729b5ef-kube-api-access-wql8d\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.861301 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-utilities\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.861328 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-catalog-content\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.963606 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-utilities\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.963664 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-catalog-content\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.963722 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wql8d\" (UniqueName: \"kubernetes.io/projected/a6f03b61-4740-4538-877d-40390729b5ef-kube-api-access-wql8d\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.964641 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-utilities\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:22 crc kubenswrapper[4985]: I0224 10:12:22.964745 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-catalog-content\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:22.999566 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wql8d\" (UniqueName: \"kubernetes.io/projected/a6f03b61-4740-4538-877d-40390729b5ef-kube-api-access-wql8d\") pod \"redhat-operators-v2cq6\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.065111 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhlvp"] Feb 24 10:12:23 crc 
kubenswrapper[4985]: I0224 10:12:23.130119 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52720: no serving certificate available for the kubelet" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.199159 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.323171 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.323976 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.329357 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.339046 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.330134 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.372216 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7066141c-e84c-4c30-a1c8-f00f58f2a704-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.372342 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7066141c-e84c-4c30-a1c8-f00f58f2a704-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.475087 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7066141c-e84c-4c30-a1c8-f00f58f2a704-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.475268 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7066141c-e84c-4c30-a1c8-f00f58f2a704-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.475448 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7066141c-e84c-4c30-a1c8-f00f58f2a704-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.490853 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.494572 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7066141c-e84c-4c30-a1c8-f00f58f2a704-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.648188 4985 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.740789 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:23 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:23 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:23 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.740844 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.753036 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xkc65" event={"ID":"d4340d1a-60cb-4240-87ba-1e468c9c41cf","Type":"ContainerStarted","Data":"44f65940d578d34a5b21f566dd0ec2c3570a49c26c0d63d01473a6ab1da7da29"} Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.753131 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xkc65" event={"ID":"d4340d1a-60cb-4240-87ba-1e468c9c41cf","Type":"ContainerStarted","Data":"f159b3a220c2ea329d33b7c13b03879f2756e516d760cbd45e9440e0834982fb"} Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.759056 4985 generic.go:334] "Generic (PLEG): container finished" podID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerID="3661b5f3e491389b533ed8a120be3900c7adfbc604c62a0ba448f1aebf7bd121" exitCode=0 Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.759275 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rhlvp" event={"ID":"2fd0e2bc-ed7e-4a38-b107-9217d349ad15","Type":"ContainerDied","Data":"3661b5f3e491389b533ed8a120be3900c7adfbc604c62a0ba448f1aebf7bd121"} Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.759372 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhlvp" event={"ID":"2fd0e2bc-ed7e-4a38-b107-9217d349ad15","Type":"ContainerStarted","Data":"460fef235535f28e1092b9e0a0826815c7212707e44e41e1eb19d0a0d6840616"} Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.777676 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xkc65" podStartSLOduration=181.777651928 podStartE2EDuration="3m1.777651928s" podCreationTimestamp="2026-02-24 10:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:12:23.772623594 +0000 UTC m=+228.246816144" watchObservedRunningTime="2026-02-24 10:12:23.777651928 +0000 UTC m=+228.251844488" Feb 24 10:12:23 crc kubenswrapper[4985]: I0224 10:12:23.958496 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2cq6"] Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.126437 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52726: no serving certificate available for the kubelet" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.151110 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 10:12:24 crc kubenswrapper[4985]: W0224 10:12:24.176212 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7066141c_e84c_4c30_a1c8_f00f58f2a704.slice/crio-eaf35099170e3500661086a43819a10785728405f23da7302f648d41a23ad818 WatchSource:0}: Error finding container 
eaf35099170e3500661086a43819a10785728405f23da7302f648d41a23ad818: Status 404 returned error can't find the container with id eaf35099170e3500661086a43819a10785728405f23da7302f648d41a23ad818 Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.457715 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.458533 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.461175 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.463035 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.463582 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.494867 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/900a4d40-ef8a-4617-b48c-eaa896ee080f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.494981 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900a4d40-ef8a-4617-b48c-eaa896ee080f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.596674 4985 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900a4d40-ef8a-4617-b48c-eaa896ee080f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.596780 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/900a4d40-ef8a-4617-b48c-eaa896ee080f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.596876 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/900a4d40-ef8a-4617-b48c-eaa896ee080f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.621666 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900a4d40-ef8a-4617-b48c-eaa896ee080f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.740634 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:24 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:24 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:24 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:24 crc kubenswrapper[4985]: 
I0224 10:12:24.740687 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.773803 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.776756 4985 generic.go:334] "Generic (PLEG): container finished" podID="a6f03b61-4740-4538-877d-40390729b5ef" containerID="799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86" exitCode=0 Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.776847 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2cq6" event={"ID":"a6f03b61-4740-4538-877d-40390729b5ef","Type":"ContainerDied","Data":"799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86"} Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.776916 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2cq6" event={"ID":"a6f03b61-4740-4538-877d-40390729b5ef","Type":"ContainerStarted","Data":"94a0453cf9b4945e1e2166655d10c7aebba6d3d5505aa15dcccaff46e7379aeb"} Feb 24 10:12:24 crc kubenswrapper[4985]: I0224 10:12:24.780570 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7066141c-e84c-4c30-a1c8-f00f58f2a704","Type":"ContainerStarted","Data":"eaf35099170e3500661086a43819a10785728405f23da7302f648d41a23ad818"} Feb 24 10:12:25 crc kubenswrapper[4985]: I0224 10:12:25.386985 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 10:12:25 crc kubenswrapper[4985]: I0224 10:12:25.750751 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:25 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:25 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:25 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:25 crc kubenswrapper[4985]: I0224 10:12:25.750813 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:25 crc kubenswrapper[4985]: I0224 10:12:25.827582 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"900a4d40-ef8a-4617-b48c-eaa896ee080f","Type":"ContainerStarted","Data":"86bad90314814d9a56d677b29ff3b390ab892c45a2c70a749cb4ccd8dd96045a"} Feb 24 10:12:25 crc kubenswrapper[4985]: I0224 10:12:25.835339 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7066141c-e84c-4c30-a1c8-f00f58f2a704","Type":"ContainerStarted","Data":"34fe4a151d18655f43e84afdbcc3337ed00c5af032c85e809d11ac3dc6c1062d"} Feb 24 10:12:26 crc kubenswrapper[4985]: I0224 10:12:26.738225 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:26 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:26 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:26 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:26 crc kubenswrapper[4985]: I0224 10:12:26.738548 4985 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:26 crc kubenswrapper[4985]: I0224 10:12:26.871470 4985 generic.go:334] "Generic (PLEG): container finished" podID="900a4d40-ef8a-4617-b48c-eaa896ee080f" containerID="a878e2c2d0d80e3ecd1c7dd06dc41625c02598ddd9c8622daf911baa6b263a64" exitCode=0 Feb 24 10:12:26 crc kubenswrapper[4985]: I0224 10:12:26.871581 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"900a4d40-ef8a-4617-b48c-eaa896ee080f","Type":"ContainerDied","Data":"a878e2c2d0d80e3ecd1c7dd06dc41625c02598ddd9c8622daf911baa6b263a64"} Feb 24 10:12:26 crc kubenswrapper[4985]: I0224 10:12:26.900571 4985 generic.go:334] "Generic (PLEG): container finished" podID="7066141c-e84c-4c30-a1c8-f00f58f2a704" containerID="34fe4a151d18655f43e84afdbcc3337ed00c5af032c85e809d11ac3dc6c1062d" exitCode=0 Feb 24 10:12:26 crc kubenswrapper[4985]: I0224 10:12:26.900623 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7066141c-e84c-4c30-a1c8-f00f58f2a704","Type":"ContainerDied","Data":"34fe4a151d18655f43e84afdbcc3337ed00c5af032c85e809d11ac3dc6c1062d"} Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.487185 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.631914 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7066141c-e84c-4c30-a1c8-f00f58f2a704-kubelet-dir\") pod \"7066141c-e84c-4c30-a1c8-f00f58f2a704\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.632077 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7066141c-e84c-4c30-a1c8-f00f58f2a704-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7066141c-e84c-4c30-a1c8-f00f58f2a704" (UID: "7066141c-e84c-4c30-a1c8-f00f58f2a704"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.632118 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7066141c-e84c-4c30-a1c8-f00f58f2a704-kube-api-access\") pod \"7066141c-e84c-4c30-a1c8-f00f58f2a704\" (UID: \"7066141c-e84c-4c30-a1c8-f00f58f2a704\") " Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.632353 4985 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7066141c-e84c-4c30-a1c8-f00f58f2a704-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.657198 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7066141c-e84c-4c30-a1c8-f00f58f2a704-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7066141c-e84c-4c30-a1c8-f00f58f2a704" (UID: "7066141c-e84c-4c30-a1c8-f00f58f2a704"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.734477 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7066141c-e84c-4c30-a1c8-f00f58f2a704-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.738146 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:27 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:27 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:27 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.738204 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.911144 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.915811 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7066141c-e84c-4c30-a1c8-f00f58f2a704","Type":"ContainerDied","Data":"eaf35099170e3500661086a43819a10785728405f23da7302f648d41a23ad818"} Feb 24 10:12:27 crc kubenswrapper[4985]: I0224 10:12:27.915936 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf35099170e3500661086a43819a10785728405f23da7302f648d41a23ad818" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.253851 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.446237 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900a4d40-ef8a-4617-b48c-eaa896ee080f-kube-api-access\") pod \"900a4d40-ef8a-4617-b48c-eaa896ee080f\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.446309 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/900a4d40-ef8a-4617-b48c-eaa896ee080f-kubelet-dir\") pod \"900a4d40-ef8a-4617-b48c-eaa896ee080f\" (UID: \"900a4d40-ef8a-4617-b48c-eaa896ee080f\") " Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.446584 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900a4d40-ef8a-4617-b48c-eaa896ee080f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "900a4d40-ef8a-4617-b48c-eaa896ee080f" (UID: "900a4d40-ef8a-4617-b48c-eaa896ee080f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.449485 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900a4d40-ef8a-4617-b48c-eaa896ee080f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "900a4d40-ef8a-4617-b48c-eaa896ee080f" (UID: "900a4d40-ef8a-4617-b48c-eaa896ee080f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.502715 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w5s2c" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.547355 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900a4d40-ef8a-4617-b48c-eaa896ee080f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.547386 4985 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/900a4d40-ef8a-4617-b48c-eaa896ee080f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.737721 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:28 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:28 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:28 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.737766 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.920369 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"900a4d40-ef8a-4617-b48c-eaa896ee080f","Type":"ContainerDied","Data":"86bad90314814d9a56d677b29ff3b390ab892c45a2c70a749cb4ccd8dd96045a"} Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.920413 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86bad90314814d9a56d677b29ff3b390ab892c45a2c70a749cb4ccd8dd96045a" Feb 24 10:12:28 crc kubenswrapper[4985]: I0224 10:12:28.920420 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:12:29 crc kubenswrapper[4985]: I0224 10:12:29.737881 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:29 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:29 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:29 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:29 crc kubenswrapper[4985]: I0224 10:12:29.737965 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:30 crc kubenswrapper[4985]: I0224 10:12:30.630919 4985 patch_prober.go:28] interesting pod/console-f9d7485db-d7v4s container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 24 10:12:30 crc kubenswrapper[4985]: I0224 10:12:30.631348 
4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-d7v4s" podUID="8c5e6fc2-42c9-4794-91cc-1f74adf686db" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 24 10:12:30 crc kubenswrapper[4985]: I0224 10:12:30.738113 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:30 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:30 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:30 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:30 crc kubenswrapper[4985]: I0224 10:12:30.738188 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:31 crc kubenswrapper[4985]: I0224 10:12:31.736658 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:31 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:31 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:31 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:31 crc kubenswrapper[4985]: I0224 10:12:31.736706 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:32 crc kubenswrapper[4985]: 
I0224 10:12:32.712105 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:32 crc kubenswrapper[4985]: I0224 10:12:32.712161 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:32 crc kubenswrapper[4985]: I0224 10:12:32.713259 4985 patch_prober.go:28] interesting pod/downloads-7954f5f757-zjjx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 24 10:12:32 crc kubenswrapper[4985]: I0224 10:12:32.713288 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zjjx8" podUID="bce7d3e8-b855-43ae-8527-fbc14ac50521" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 24 10:12:32 crc kubenswrapper[4985]: I0224 10:12:32.737075 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:32 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:32 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:32 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:32 crc kubenswrapper[4985]: I0224 10:12:32.737146 4985 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:33 crc kubenswrapper[4985]: I0224 10:12:33.737508 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:33 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:33 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:33 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:33 crc kubenswrapper[4985]: I0224 10:12:33.737924 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:34 crc kubenswrapper[4985]: I0224 10:12:34.736956 4985 patch_prober.go:28] interesting pod/router-default-5444994796-gcwfl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:12:34 crc kubenswrapper[4985]: [-]has-synced failed: reason withheld Feb 24 10:12:34 crc kubenswrapper[4985]: [+]process-running ok Feb 24 10:12:34 crc kubenswrapper[4985]: healthz check failed Feb 24 10:12:34 crc kubenswrapper[4985]: I0224 10:12:34.737258 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gcwfl" podUID="4cc3b979-d591-4106-8874-760925fd10f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:12:35 crc kubenswrapper[4985]: I0224 10:12:35.229625 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"]
Feb 24 10:12:35 crc kubenswrapper[4985]: I0224 10:12:35.229903 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager" containerID="cri-o://4d31b00cfb765cd97b59b6002153ed8950bbc7226dbf7476eacb407215e52cb5" gracePeriod=30
Feb 24 10:12:35 crc kubenswrapper[4985]: I0224 10:12:35.252641 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7"]
Feb 24 10:12:35 crc kubenswrapper[4985]: I0224 10:12:35.252995 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" containerID="cri-o://1a04a9fa1abe62c3dd7a539a0b7bea961f22b45e056bbfd47c27122efcb4e73b" gracePeriod=30
Feb 24 10:12:35 crc kubenswrapper[4985]: I0224 10:12:35.737709 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gcwfl"
Feb 24 10:12:35 crc kubenswrapper[4985]: I0224 10:12:35.739575 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gcwfl"
Feb 24 10:12:36 crc kubenswrapper[4985]: I0224 10:12:36.980112 4985 generic.go:334] "Generic (PLEG): container finished" podID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerID="4d31b00cfb765cd97b59b6002153ed8950bbc7226dbf7476eacb407215e52cb5" exitCode=0
Feb 24 10:12:36 crc kubenswrapper[4985]: I0224 10:12:36.980271 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" event={"ID":"b7825dac-b973-4913-bf15-eb8fc3fe6236","Type":"ContainerDied","Data":"4d31b00cfb765cd97b59b6002153ed8950bbc7226dbf7476eacb407215e52cb5"}
Feb 24 10:12:36 crc kubenswrapper[4985]: I0224 10:12:36.982085 4985 generic.go:334] "Generic (PLEG): container finished" podID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerID="1a04a9fa1abe62c3dd7a539a0b7bea961f22b45e056bbfd47c27122efcb4e73b" exitCode=0
Feb 24 10:12:36 crc kubenswrapper[4985]: I0224 10:12:36.982148 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" event={"ID":"86bcd109-4d8a-427f-9c07-1e88c6e1ea22","Type":"ContainerDied","Data":"1a04a9fa1abe62c3dd7a539a0b7bea961f22b45e056bbfd47c27122efcb4e73b"}
Feb 24 10:12:38 crc kubenswrapper[4985]: I0224 10:12:38.271699 4985 patch_prober.go:28] interesting pod/route-controller-manager-ddbfd44ff-n5gs7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:12:38 crc kubenswrapper[4985]: I0224 10:12:38.271753 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:12:38 crc kubenswrapper[4985]: I0224 10:12:38.286257 4985 patch_prober.go:28] interesting pod/controller-manager-7fd5bb98f6-xk6s7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:12:38 crc kubenswrapper[4985]: I0224 10:12:38.286307 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:12:40 crc kubenswrapper[4985]: I0224 10:12:40.636706 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-d7v4s"
Feb 24 10:12:40 crc kubenswrapper[4985]: I0224 10:12:40.645127 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-d7v4s"
Feb 24 10:12:40 crc kubenswrapper[4985]: I0224 10:12:40.917638 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6"
Feb 24 10:12:42 crc kubenswrapper[4985]: I0224 10:12:42.717163 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zjjx8"
Feb 24 10:12:43 crc kubenswrapper[4985]: I0224 10:12:43.624854 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:12:43 crc kubenswrapper[4985]: I0224 10:12:43.625002 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:12:44 crc kubenswrapper[4985]: I0224 10:12:44.651736 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35672: no serving certificate available for the kubelet"
Feb 24 10:12:48 crc kubenswrapper[4985]: I0224 10:12:48.272645 4985 patch_prober.go:28] interesting pod/route-controller-manager-ddbfd44ff-n5gs7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:12:48 crc kubenswrapper[4985]: I0224 10:12:48.273014 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:12:48 crc kubenswrapper[4985]: I0224 10:12:48.285765 4985 patch_prober.go:28] interesting pod/controller-manager-7fd5bb98f6-xk6s7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:12:48 crc kubenswrapper[4985]: I0224 10:12:48.285920 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:12:53 crc kubenswrapper[4985]: I0224 10:12:53.581381 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hfkq2"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.463409 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.463687 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7066141c-e84c-4c30-a1c8-f00f58f2a704" containerName="pruner"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.463701 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="7066141c-e84c-4c30-a1c8-f00f58f2a704" containerName="pruner"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.463717 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900a4d40-ef8a-4617-b48c-eaa896ee080f" containerName="pruner"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.463725 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="900a4d40-ef8a-4617-b48c-eaa896ee080f" containerName="pruner"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.463842 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="7066141c-e84c-4c30-a1c8-f00f58f2a704" containerName="pruner"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.463879 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="900a4d40-ef8a-4617-b48c-eaa896ee080f" containerName="pruner"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.464383 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.466715 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.466761 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.472957 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.626038 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/646342de-73ea-4a23-b414-66b824660b5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.626141 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/646342de-73ea-4a23-b414-66b824660b5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.727928 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/646342de-73ea-4a23-b414-66b824660b5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.728306 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/646342de-73ea-4a23-b414-66b824660b5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.728018 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/646342de-73ea-4a23-b414-66b824660b5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.748093 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/646342de-73ea-4a23-b414-66b824660b5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.764660 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.764827 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qpdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f9svt_openshift-marketplace(cd5749ab-5662-41d6-9d8b-663bdd9b7a0b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.766301 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f9svt" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b"
Feb 24 10:12:55 crc kubenswrapper[4985]: I0224 10:12:55.787418 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.904028 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.904177 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6s6dj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ptllg_openshift-marketplace(d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 24 10:12:55 crc kubenswrapper[4985]: E0224 10:12:55.905385 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ptllg" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4"
Feb 24 10:12:58 crc kubenswrapper[4985]: I0224 10:12:58.272373 4985 patch_prober.go:28] interesting pod/route-controller-manager-ddbfd44ff-n5gs7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:12:58 crc kubenswrapper[4985]: I0224 10:12:58.272422 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:12:58 crc kubenswrapper[4985]: I0224 10:12:58.286244 4985 patch_prober.go:28] interesting pod/controller-manager-7fd5bb98f6-xk6s7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:12:58 crc kubenswrapper[4985]: I0224 10:12:58.286289 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.053581 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.059466 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.068270 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.212662 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e3d6d7b-c88d-4966-8a11-915441e6b482-kube-api-access\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.212769 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-var-lock\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.212795 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.313927 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e3d6d7b-c88d-4966-8a11-915441e6b482-kube-api-access\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.314011 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-var-lock\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.314033 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.314125 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.314153 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-var-lock\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.333029 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e3d6d7b-c88d-4966-8a11-915441e6b482-kube-api-access\") pod \"installer-9-crc\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.427438 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 24 10:13:00 crc kubenswrapper[4985]: E0224 10:13:00.809646 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ptllg" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4"
Feb 24 10:13:00 crc kubenswrapper[4985]: E0224 10:13:00.811357 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f9svt" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.876002 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.887660 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7" event={"ID":"b7825dac-b973-4913-bf15-eb8fc3fe6236","Type":"ContainerDied","Data":"ce3b9082ca39488e29ea0b35955742a6353c76102255c8ba68e8ca36f8ff49fe"}
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.887722 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.887726 4985 scope.go:117] "RemoveContainer" containerID="4d31b00cfb765cd97b59b6002153ed8950bbc7226dbf7476eacb407215e52cb5"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.918113 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5699f48c5-58p8c"]
Feb 24 10:13:00 crc kubenswrapper[4985]: E0224 10:13:00.918483 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.918500 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.918646 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" containerName="controller-manager"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.919164 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:00 crc kubenswrapper[4985]: I0224 10:13:00.928117 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5699f48c5-58p8c"]
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.023765 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-proxy-ca-bundles\") pod \"b7825dac-b973-4913-bf15-eb8fc3fe6236\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") "
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.023829 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7825dac-b973-4913-bf15-eb8fc3fe6236-serving-cert\") pod \"b7825dac-b973-4913-bf15-eb8fc3fe6236\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") "
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.023858 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk6xc\" (UniqueName: \"kubernetes.io/projected/b7825dac-b973-4913-bf15-eb8fc3fe6236-kube-api-access-jk6xc\") pod \"b7825dac-b973-4913-bf15-eb8fc3fe6236\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") "
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024353 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-client-ca\") pod \"b7825dac-b973-4913-bf15-eb8fc3fe6236\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") "
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024396 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-config\") pod \"b7825dac-b973-4913-bf15-eb8fc3fe6236\" (UID: \"b7825dac-b973-4913-bf15-eb8fc3fe6236\") "
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024498 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-proxy-ca-bundles\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024544 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13afe978-9654-487a-9742-cf5e2e5a1e00-serving-cert\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024579 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-client-ca\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024738 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b7825dac-b973-4913-bf15-eb8fc3fe6236" (UID: "b7825dac-b973-4913-bf15-eb8fc3fe6236"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024807 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-config\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.024929 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2h6\" (UniqueName: \"kubernetes.io/projected/13afe978-9654-487a-9742-cf5e2e5a1e00-kube-api-access-lt2h6\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.025097 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.025226 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-config" (OuterVolumeSpecName: "config") pod "b7825dac-b973-4913-bf15-eb8fc3fe6236" (UID: "b7825dac-b973-4913-bf15-eb8fc3fe6236"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.025378 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-client-ca" (OuterVolumeSpecName: "client-ca") pod "b7825dac-b973-4913-bf15-eb8fc3fe6236" (UID: "b7825dac-b973-4913-bf15-eb8fc3fe6236"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.027293 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7825dac-b973-4913-bf15-eb8fc3fe6236-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b7825dac-b973-4913-bf15-eb8fc3fe6236" (UID: "b7825dac-b973-4913-bf15-eb8fc3fe6236"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.027832 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7825dac-b973-4913-bf15-eb8fc3fe6236-kube-api-access-jk6xc" (OuterVolumeSpecName: "kube-api-access-jk6xc") pod "b7825dac-b973-4913-bf15-eb8fc3fe6236" (UID: "b7825dac-b973-4913-bf15-eb8fc3fe6236"). InnerVolumeSpecName "kube-api-access-jk6xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.126520 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-proxy-ca-bundles\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.127777 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13afe978-9654-487a-9742-cf5e2e5a1e00-serving-cert\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.128036 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-client-ca\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.128313 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-config\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.128544 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2h6\" (UniqueName: \"kubernetes.io/projected/13afe978-9654-487a-9742-cf5e2e5a1e00-kube-api-access-lt2h6\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.128797 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.129001 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7825dac-b973-4913-bf15-eb8fc3fe6236-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.129172 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7825dac-b973-4913-bf15-eb8fc3fe6236-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.129342 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk6xc\" (UniqueName: \"kubernetes.io/projected/b7825dac-b973-4913-bf15-eb8fc3fe6236-kube-api-access-jk6xc\") on node \"crc\" DevicePath \"\""
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.128552 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-proxy-ca-bundles\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.129075 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-client-ca\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.130557 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-config\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.132223 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13afe978-9654-487a-9742-cf5e2e5a1e00-serving-cert\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.228689 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"]
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.232158 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fd5bb98f6-xk6s7"]
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.285332 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2h6\" (UniqueName: \"kubernetes.io/projected/13afe978-9654-487a-9742-cf5e2e5a1e00-kube-api-access-lt2h6\") pod \"controller-manager-5699f48c5-58p8c\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.483850 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:13:01 crc kubenswrapper[4985]: I0224 10:13:01.537150 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c"
Feb 24 10:13:02 crc kubenswrapper[4985]: I0224 10:13:02.276200 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7825dac-b973-4913-bf15-eb8fc3fe6236" path="/var/lib/kubelet/pods/b7825dac-b973-4913-bf15-eb8fc3fe6236/volumes"
Feb 24 10:13:02 crc kubenswrapper[4985]: E0224 10:13:02.602427 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 24 10:13:02 crc kubenswrapper[4985]: E0224 10:13:02.602614 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2psfq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rhlvp_openshift-marketplace(2fd0e2bc-ed7e-4a38-b107-9217d349ad15): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:13:02 crc kubenswrapper[4985]: E0224 10:13:02.603790 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rhlvp" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" Feb 24 10:13:02 crc 
kubenswrapper[4985]: E0224 10:13:02.731946 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 24 10:13:02 crc kubenswrapper[4985]: E0224 10:13:02.732134 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wql8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-v2cq6_openshift-marketplace(a6f03b61-4740-4538-877d-40390729b5ef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:13:02 crc kubenswrapper[4985]: E0224 10:13:02.733307 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v2cq6" podUID="a6f03b61-4740-4538-877d-40390729b5ef" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.373387 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rhlvp" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.373969 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v2cq6" podUID="a6f03b61-4740-4538-877d-40390729b5ef" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.465570 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.486947 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn87s\" (UniqueName: \"kubernetes.io/projected/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-kube-api-access-rn87s\") pod \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.487068 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-config\") pod \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.487097 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-client-ca\") pod \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.487149 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-serving-cert\") pod \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\" (UID: \"86bcd109-4d8a-427f-9c07-1e88c6e1ea22\") " Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.490301 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-client-ca" (OuterVolumeSpecName: "client-ca") pod "86bcd109-4d8a-427f-9c07-1e88c6e1ea22" (UID: "86bcd109-4d8a-427f-9c07-1e88c6e1ea22"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.491155 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-config" (OuterVolumeSpecName: "config") pod "86bcd109-4d8a-427f-9c07-1e88c6e1ea22" (UID: "86bcd109-4d8a-427f-9c07-1e88c6e1ea22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.497587 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86bcd109-4d8a-427f-9c07-1e88c6e1ea22" (UID: "86bcd109-4d8a-427f-9c07-1e88c6e1ea22"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.498046 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-kube-api-access-rn87s" (OuterVolumeSpecName: "kube-api-access-rn87s") pod "86bcd109-4d8a-427f-9c07-1e88c6e1ea22" (UID: "86bcd109-4d8a-427f-9c07-1e88c6e1ea22"). InnerVolumeSpecName "kube-api-access-rn87s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.499716 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7"] Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.499994 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.500017 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.500135 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" containerName="route-controller-manager" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.500605 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.516953 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.517170 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jrwz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l5p66_openshift-marketplace(5186b86d-7d8f-4ed5-b444-991eaf2a793e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.518751 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l5p66" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" Feb 24 10:13:04 crc 
kubenswrapper[4985]: I0224 10:13:04.520970 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7"] Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.546707 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.546876 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rqx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminat
ionMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-msgzd_openshift-marketplace(720477c9-8e44-43cf-a9ba-5ac5b96fe65b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:13:04 crc kubenswrapper[4985]: E0224 10:13:04.548322 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-msgzd" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588045 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfb5a3d-1117-42e6-b4e3-59afe7960139-serving-cert\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588364 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-client-ca\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588400 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spssq\" (UniqueName: \"kubernetes.io/projected/7dfb5a3d-1117-42e6-b4e3-59afe7960139-kube-api-access-spssq\") 
pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588427 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-config\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588551 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn87s\" (UniqueName: \"kubernetes.io/projected/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-kube-api-access-rn87s\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588590 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588602 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.588612 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bcd109-4d8a-427f-9c07-1e88c6e1ea22-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.689530 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-client-ca\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: 
\"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.689599 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spssq\" (UniqueName: \"kubernetes.io/projected/7dfb5a3d-1117-42e6-b4e3-59afe7960139-kube-api-access-spssq\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.689634 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-config\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.689665 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfb5a3d-1117-42e6-b4e3-59afe7960139-serving-cert\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.690811 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-client-ca\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.690880 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-config\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.695482 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfb5a3d-1117-42e6-b4e3-59afe7960139-serving-cert\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.706995 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spssq\" (UniqueName: \"kubernetes.io/projected/7dfb5a3d-1117-42e6-b4e3-59afe7960139-kube-api-access-spssq\") pod \"route-controller-manager-58f755ff7c-fvhz7\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.847016 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.911092 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" event={"ID":"86bcd109-4d8a-427f-9c07-1e88c6e1ea22","Type":"ContainerDied","Data":"a67942c5969e081d5a0c4e407678a8ccecd962b6e84b232616bd30eaed529c2d"} Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.911107 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7" Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.973764 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7"] Feb 24 10:13:04 crc kubenswrapper[4985]: I0224 10:13:04.979867 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddbfd44ff-n5gs7"] Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.272574 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86bcd109-4d8a-427f-9c07-1e88c6e1ea22" path="/var/lib/kubelet/pods/86bcd109-4d8a-427f-9c07-1e88c6e1ea22/volumes" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.395523 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-msgzd" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.395569 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l5p66" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.433510 4985 scope.go:117] "RemoveContainer" containerID="1a04a9fa1abe62c3dd7a539a0b7bea961f22b45e056bbfd47c27122efcb4e73b" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.480213 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.480404 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-49tf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8k2ln_openshift-marketplace(49f9b37d-f90b-49ba-bb8d-bb34255c63c0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 
10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.485926 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8k2ln" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.528310 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.528471 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mp2f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lpjfw_openshift-marketplace(fef137fa-cf9a-4695-a0b5-3863ec2ea3bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.533041 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lpjfw" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" Feb 24 10:13:06 crc 
kubenswrapper[4985]: I0224 10:13:06.749341 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7"] Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.842612 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5699f48c5-58p8c"] Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.877941 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.919497 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.931947 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" event={"ID":"13afe978-9654-487a-9742-cf5e2e5a1e00","Type":"ContainerStarted","Data":"d2f640b045cc6c545d7b8b3fd4a6f3c635e46f26812729aaf0f35a58ce9ea4b5"} Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.934836 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" event={"ID":"7dfb5a3d-1117-42e6-b4e3-59afe7960139","Type":"ContainerStarted","Data":"1e5c074e134dc1eedf9f657c44df64d6feaef9d1739cef880f58df0a1677be73"} Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.935405 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.936819 4985 patch_prober.go:28] interesting pod/route-controller-manager-58f755ff7c-fvhz7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Feb 
24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.936859 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.939341 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6e3d6d7b-c88d-4966-8a11-915441e6b482","Type":"ContainerStarted","Data":"5d9ef0afd9f506a72c0e916f93b3d6a6f3d6cad0dc71a133f13bc28b52c9a22c"} Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.943390 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8k2ln" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" Feb 24 10:13:06 crc kubenswrapper[4985]: E0224 10:13:06.943540 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lpjfw" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" Feb 24 10:13:06 crc kubenswrapper[4985]: W0224 10:13:06.945556 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod646342de_73ea_4a23_b414_66b824660b5e.slice/crio-4db9dd677c75dfdec8da418d9131c997060a31610c708448058dbf0ed07d7729 WatchSource:0}: Error finding container 4db9dd677c75dfdec8da418d9131c997060a31610c708448058dbf0ed07d7729: Status 404 returned error can't find the container with id 
4db9dd677c75dfdec8da418d9131c997060a31610c708448058dbf0ed07d7729 Feb 24 10:13:06 crc kubenswrapper[4985]: I0224 10:13:06.959481 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" podStartSLOduration=11.959461466 podStartE2EDuration="11.959461466s" podCreationTimestamp="2026-02-24 10:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:13:06.954252855 +0000 UTC m=+271.428445425" watchObservedRunningTime="2026-02-24 10:13:06.959461466 +0000 UTC m=+271.433654026" Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.946580 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" event={"ID":"13afe978-9654-487a-9742-cf5e2e5a1e00","Type":"ContainerStarted","Data":"7ba65d5fbdc9a5cf9412a25dd496582541cfdc139453e69bfd633a1079e5c4e4"} Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.947240 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.948899 4985 generic.go:334] "Generic (PLEG): container finished" podID="646342de-73ea-4a23-b414-66b824660b5e" containerID="45a912edd7b962fec32804e02d5add87a77a4fdb8026475d3e281974c6a7a095" exitCode=0 Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.948977 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"646342de-73ea-4a23-b414-66b824660b5e","Type":"ContainerDied","Data":"45a912edd7b962fec32804e02d5add87a77a4fdb8026475d3e281974c6a7a095"} Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.949006 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"646342de-73ea-4a23-b414-66b824660b5e","Type":"ContainerStarted","Data":"4db9dd677c75dfdec8da418d9131c997060a31610c708448058dbf0ed07d7729"} Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.950829 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" event={"ID":"7dfb5a3d-1117-42e6-b4e3-59afe7960139","Type":"ContainerStarted","Data":"2bf4f79eb78d609e138c285678e987df8b0e6c91a4326947ca68586510799cd1"} Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.954966 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6e3d6d7b-c88d-4966-8a11-915441e6b482","Type":"ContainerStarted","Data":"2ff32f122e7404f1f9a41c457c304d0ef3740c19f6445a4412d1b481681df96d"} Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.956687 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.961075 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.969002 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" podStartSLOduration=12.968959991 podStartE2EDuration="12.968959991s" podCreationTimestamp="2026-02-24 10:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:13:07.966469499 +0000 UTC m=+272.440662069" watchObservedRunningTime="2026-02-24 10:13:07.968959991 +0000 UTC m=+272.443152561" Feb 24 10:13:07 crc kubenswrapper[4985]: I0224 10:13:07.982824 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.982495292 podStartE2EDuration="7.982495292s" podCreationTimestamp="2026-02-24 10:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:13:07.980500494 +0000 UTC m=+272.454693044" watchObservedRunningTime="2026-02-24 10:13:07.982495292 +0000 UTC m=+272.456687852" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.212059 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.260812 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/646342de-73ea-4a23-b414-66b824660b5e-kube-api-access\") pod \"646342de-73ea-4a23-b414-66b824660b5e\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.261605 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/646342de-73ea-4a23-b414-66b824660b5e-kubelet-dir\") pod \"646342de-73ea-4a23-b414-66b824660b5e\" (UID: \"646342de-73ea-4a23-b414-66b824660b5e\") " Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.261633 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/646342de-73ea-4a23-b414-66b824660b5e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "646342de-73ea-4a23-b414-66b824660b5e" (UID: "646342de-73ea-4a23-b414-66b824660b5e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.261837 4985 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/646342de-73ea-4a23-b414-66b824660b5e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.266329 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646342de-73ea-4a23-b414-66b824660b5e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "646342de-73ea-4a23-b414-66b824660b5e" (UID: "646342de-73ea-4a23-b414-66b824660b5e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.363017 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/646342de-73ea-4a23-b414-66b824660b5e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.969911 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"646342de-73ea-4a23-b414-66b824660b5e","Type":"ContainerDied","Data":"4db9dd677c75dfdec8da418d9131c997060a31610c708448058dbf0ed07d7729"} Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.970290 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db9dd677c75dfdec8da418d9131c997060a31610c708448058dbf0ed07d7729" Feb 24 10:13:09 crc kubenswrapper[4985]: I0224 10:13:09.970017 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:13:12 crc kubenswrapper[4985]: I0224 10:13:12.989539 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerStarted","Data":"5441a26ac037a837c07fa4b99a47e59ed6b248f18309c29a5888aeebf0b65e8a"} Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.238500 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xh5q8"] Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.625207 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.625439 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.625483 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.625923 4985 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b"} pod="openshift-machine-config-operator/machine-config-daemon-hq52w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:13:13 crc 
kubenswrapper[4985]: I0224 10:13:13.625967 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" containerID="cri-o://14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b" gracePeriod=600 Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.996473 4985 generic.go:334] "Generic (PLEG): container finished" podID="11c1c7b8-18df-4583-849f-76b62544344b" containerID="14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b" exitCode=0 Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.996564 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerDied","Data":"14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b"} Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.996851 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"af9e8809fa21abba0bc6f989ce1de36b1b356edb744ae8b075b66b3c7afc91af"} Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.998581 4985 generic.go:334] "Generic (PLEG): container finished" podID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerID="5441a26ac037a837c07fa4b99a47e59ed6b248f18309c29a5888aeebf0b65e8a" exitCode=0 Feb 24 10:13:13 crc kubenswrapper[4985]: I0224 10:13:13.998611 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerDied","Data":"5441a26ac037a837c07fa4b99a47e59ed6b248f18309c29a5888aeebf0b65e8a"} Feb 24 10:13:15 crc kubenswrapper[4985]: I0224 10:13:15.006684 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerStarted","Data":"a694ca5350c8dd87df4e719af409ee5abaa11bfb56d6b48a81555580033949be"} Feb 24 10:13:15 crc kubenswrapper[4985]: I0224 10:13:15.029297 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ptllg" podStartSLOduration=2.215538098 podStartE2EDuration="54.029278531s" podCreationTimestamp="2026-02-24 10:12:21 +0000 UTC" firstStartedPulling="2026-02-24 10:12:22.7390107 +0000 UTC m=+227.213203260" lastFinishedPulling="2026-02-24 10:13:14.552751133 +0000 UTC m=+279.026943693" observedRunningTime="2026-02-24 10:13:15.024074872 +0000 UTC m=+279.498267432" watchObservedRunningTime="2026-02-24 10:13:15.029278531 +0000 UTC m=+279.503471101" Feb 24 10:13:18 crc kubenswrapper[4985]: I0224 10:13:18.023116 4985 generic.go:334] "Generic (PLEG): container finished" podID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerID="ace5069341fc000f51a042d88790d0e024e879e745f830063cb3aeb897669cdf" exitCode=0 Feb 24 10:13:18 crc kubenswrapper[4985]: I0224 10:13:18.023187 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9svt" event={"ID":"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b","Type":"ContainerDied","Data":"ace5069341fc000f51a042d88790d0e024e879e745f830063cb3aeb897669cdf"} Feb 24 10:13:20 crc kubenswrapper[4985]: I0224 10:13:20.037752 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9svt" event={"ID":"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b","Type":"ContainerStarted","Data":"64c9a92fbeb5cc0bcff69d7245731bd930391eb44ccd78f3992cbe34cf3adac5"} Feb 24 10:13:20 crc kubenswrapper[4985]: I0224 10:13:20.057514 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f9svt" podStartSLOduration=2.928277383 podStartE2EDuration="59.057492613s" 
podCreationTimestamp="2026-02-24 10:12:21 +0000 UTC" firstStartedPulling="2026-02-24 10:12:22.733063129 +0000 UTC m=+227.207255689" lastFinishedPulling="2026-02-24 10:13:18.862278359 +0000 UTC m=+283.336470919" observedRunningTime="2026-02-24 10:13:20.057125912 +0000 UTC m=+284.531318472" watchObservedRunningTime="2026-02-24 10:13:20.057492613 +0000 UTC m=+284.531685173" Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.046526 4985 generic.go:334] "Generic (PLEG): container finished" podID="a6f03b61-4740-4538-877d-40390729b5ef" containerID="e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00" exitCode=0 Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.046709 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2cq6" event={"ID":"a6f03b61-4740-4538-877d-40390729b5ef","Type":"ContainerDied","Data":"e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00"} Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.048722 4985 generic.go:334] "Generic (PLEG): container finished" podID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerID="f97a4fe8d5def25f9a0ad14d50af81076a971b1a5ff9999faf958b7e0e497653" exitCode=0 Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.048777 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhlvp" event={"ID":"2fd0e2bc-ed7e-4a38-b107-9217d349ad15","Type":"ContainerDied","Data":"f97a4fe8d5def25f9a0ad14d50af81076a971b1a5ff9999faf958b7e0e497653"} Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.053677 4985 generic.go:334] "Generic (PLEG): container finished" podID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerID="a6f8be874c61afaba5b803b64dde25fcd4a99ff14ea0f1bed7a889c4a087a73b" exitCode=0 Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.053731 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msgzd" 
event={"ID":"720477c9-8e44-43cf-a9ba-5ac5b96fe65b","Type":"ContainerDied","Data":"a6f8be874c61afaba5b803b64dde25fcd4a99ff14ea0f1bed7a889c4a087a73b"} Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.542926 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.542977 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.679399 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.904056 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.904103 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:13:21 crc kubenswrapper[4985]: I0224 10:13:21.963098 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.062938 4985 generic.go:334] "Generic (PLEG): container finished" podID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerID="5e5a4e5b12835c383b62f45f3756cc9dd16cbb0eff1ea1b5d5155c71c651e122" exitCode=0 Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.063010 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2ln" event={"ID":"49f9b37d-f90b-49ba-bb8d-bb34255c63c0","Type":"ContainerDied","Data":"5e5a4e5b12835c383b62f45f3756cc9dd16cbb0eff1ea1b5d5155c71c651e122"} Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.073060 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-v2cq6" event={"ID":"a6f03b61-4740-4538-877d-40390729b5ef","Type":"ContainerStarted","Data":"24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf"} Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.081296 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhlvp" event={"ID":"2fd0e2bc-ed7e-4a38-b107-9217d349ad15","Type":"ContainerStarted","Data":"09e4a5fca1f0926f07129896234ee5043d7c6da9fd3c83fd99e821c6979a7f1e"} Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.084775 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerStarted","Data":"53ac39259d9a89039900e4398d48a67fb894ffcce97799faa2fde54b8ed04874"} Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.101162 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msgzd" event={"ID":"720477c9-8e44-43cf-a9ba-5ac5b96fe65b","Type":"ContainerStarted","Data":"9dea687a6ab33bc660735e007170e9e6e56d95cb0fd2eb33c1dbeca5d60b15a3"} Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.117167 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v2cq6" podStartSLOduration=3.329780809 podStartE2EDuration="1m0.117147387s" podCreationTimestamp="2026-02-24 10:12:22 +0000 UTC" firstStartedPulling="2026-02-24 10:12:24.780679936 +0000 UTC m=+229.254872496" lastFinishedPulling="2026-02-24 10:13:21.568046514 +0000 UTC m=+286.042239074" observedRunningTime="2026-02-24 10:13:22.114861641 +0000 UTC m=+286.589054201" watchObservedRunningTime="2026-02-24 10:13:22.117147387 +0000 UTC m=+286.591339947" Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.133604 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rhlvp" 
podStartSLOduration=2.471640253 podStartE2EDuration="1m0.133584811s" podCreationTimestamp="2026-02-24 10:12:22 +0000 UTC" firstStartedPulling="2026-02-24 10:12:23.762167874 +0000 UTC m=+228.236360434" lastFinishedPulling="2026-02-24 10:13:21.424112432 +0000 UTC m=+285.898304992" observedRunningTime="2026-02-24 10:13:22.133156479 +0000 UTC m=+286.607349039" watchObservedRunningTime="2026-02-24 10:13:22.133584811 +0000 UTC m=+286.607777371" Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.168542 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.201020 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-msgzd" podStartSLOduration=3.326663721 podStartE2EDuration="1m3.201003097s" podCreationTimestamp="2026-02-24 10:12:19 +0000 UTC" firstStartedPulling="2026-02-24 10:12:21.659009318 +0000 UTC m=+226.133201878" lastFinishedPulling="2026-02-24 10:13:21.533348694 +0000 UTC m=+286.007541254" observedRunningTime="2026-02-24 10:13:22.181296037 +0000 UTC m=+286.655488597" watchObservedRunningTime="2026-02-24 10:13:22.201003097 +0000 UTC m=+286.675195657" Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.699915 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:13:22 crc kubenswrapper[4985]: I0224 10:13:22.700325 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.106523 4985 generic.go:334] "Generic (PLEG): container finished" podID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerID="53ac39259d9a89039900e4398d48a67fb894ffcce97799faa2fde54b8ed04874" exitCode=0 Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.106623 4985 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerDied","Data":"53ac39259d9a89039900e4398d48a67fb894ffcce97799faa2fde54b8ed04874"} Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.112315 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2ln" event={"ID":"49f9b37d-f90b-49ba-bb8d-bb34255c63c0","Type":"ContainerStarted","Data":"ff64fe42248851cbaec0afa0bd914ca330a30adb7ab6f8ce246034f3cefc7931"} Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.152785 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8k2ln" podStartSLOduration=2.158595439 podStartE2EDuration="1m4.152763616s" podCreationTimestamp="2026-02-24 10:12:19 +0000 UTC" firstStartedPulling="2026-02-24 10:12:20.545192471 +0000 UTC m=+225.019385051" lastFinishedPulling="2026-02-24 10:13:22.539360668 +0000 UTC m=+287.013553228" observedRunningTime="2026-02-24 10:13:23.149548233 +0000 UTC m=+287.623740813" watchObservedRunningTime="2026-02-24 10:13:23.152763616 +0000 UTC m=+287.626956176" Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.200366 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.200428 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:13:23 crc kubenswrapper[4985]: I0224 10:13:23.734300 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rhlvp" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="registry-server" probeResult="failure" output=< Feb 24 10:13:23 crc kubenswrapper[4985]: timeout: failed to connect service ":50051" within 1s Feb 24 10:13:23 crc kubenswrapper[4985]: > Feb 24 10:13:24 crc 
kubenswrapper[4985]: I0224 10:13:24.121006 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerStarted","Data":"c7fe22917b7d142d3fe274699e2f1f5138e0263bfd36f6a94aff95ad95eb8a5a"} Feb 24 10:13:24 crc kubenswrapper[4985]: I0224 10:13:24.122935 4985 generic.go:334] "Generic (PLEG): container finished" podID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerID="4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69" exitCode=0 Feb 24 10:13:24 crc kubenswrapper[4985]: I0224 10:13:24.122976 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpjfw" event={"ID":"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf","Type":"ContainerDied","Data":"4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69"} Feb 24 10:13:24 crc kubenswrapper[4985]: I0224 10:13:24.141798 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l5p66" podStartSLOduration=1.907249945 podStartE2EDuration="1m5.141781801s" podCreationTimestamp="2026-02-24 10:12:19 +0000 UTC" firstStartedPulling="2026-02-24 10:12:20.555067543 +0000 UTC m=+225.029260123" lastFinishedPulling="2026-02-24 10:13:23.789599419 +0000 UTC m=+288.263791979" observedRunningTime="2026-02-24 10:13:24.139038092 +0000 UTC m=+288.613230652" watchObservedRunningTime="2026-02-24 10:13:24.141781801 +0000 UTC m=+288.615974361" Feb 24 10:13:24 crc kubenswrapper[4985]: I0224 10:13:24.242511 4985 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v2cq6" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="registry-server" probeResult="failure" output=< Feb 24 10:13:24 crc kubenswrapper[4985]: timeout: failed to connect service ":50051" within 1s Feb 24 10:13:24 crc kubenswrapper[4985]: > Feb 24 10:13:25 crc kubenswrapper[4985]: I0224 10:13:25.137275 4985 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpjfw" event={"ID":"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf","Type":"ContainerStarted","Data":"922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65"} Feb 24 10:13:25 crc kubenswrapper[4985]: I0224 10:13:25.161053 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lpjfw" podStartSLOduration=2.195828186 podStartE2EDuration="1m6.161033728s" podCreationTimestamp="2026-02-24 10:12:19 +0000 UTC" firstStartedPulling="2026-02-24 10:12:20.546830491 +0000 UTC m=+225.021023051" lastFinishedPulling="2026-02-24 10:13:24.512036033 +0000 UTC m=+288.986228593" observedRunningTime="2026-02-24 10:13:25.15972971 +0000 UTC m=+289.633922290" watchObservedRunningTime="2026-02-24 10:13:25.161033728 +0000 UTC m=+289.635226288" Feb 24 10:13:25 crc kubenswrapper[4985]: I0224 10:13:25.648612 4985 ???:1] "http: TLS handshake error from 192.168.126.11:57780: no serving certificate available for the kubelet" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.507393 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.507714 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.541606 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.712709 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.712781 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.750382 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.898272 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.898317 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:13:29 crc kubenswrapper[4985]: I0224 10:13:29.945066 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.106823 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.106867 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.140150 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.199571 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.204403 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.205756 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:13:30 crc kubenswrapper[4985]: 
I0224 10:13:30.220623 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:13:30 crc kubenswrapper[4985]: I0224 10:13:30.892028 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-msgzd"] Feb 24 10:13:31 crc kubenswrapper[4985]: I0224 10:13:31.940875 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:13:32 crc kubenswrapper[4985]: I0224 10:13:32.171777 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-msgzd" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="registry-server" containerID="cri-o://9dea687a6ab33bc660735e007170e9e6e56d95cb0fd2eb33c1dbeca5d60b15a3" gracePeriod=2 Feb 24 10:13:32 crc kubenswrapper[4985]: I0224 10:13:32.293243 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8k2ln"] Feb 24 10:13:32 crc kubenswrapper[4985]: I0224 10:13:32.293458 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8k2ln" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="registry-server" containerID="cri-o://ff64fe42248851cbaec0afa0bd914ca330a30adb7ab6f8ce246034f3cefc7931" gracePeriod=2 Feb 24 10:13:32 crc kubenswrapper[4985]: I0224 10:13:32.734588 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:13:32 crc kubenswrapper[4985]: I0224 10:13:32.769365 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:13:33 crc kubenswrapper[4985]: I0224 10:13:33.177662 4985 generic.go:334] "Generic (PLEG): container finished" podID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" 
containerID="9dea687a6ab33bc660735e007170e9e6e56d95cb0fd2eb33c1dbeca5d60b15a3" exitCode=0 Feb 24 10:13:33 crc kubenswrapper[4985]: I0224 10:13:33.177729 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msgzd" event={"ID":"720477c9-8e44-43cf-a9ba-5ac5b96fe65b","Type":"ContainerDied","Data":"9dea687a6ab33bc660735e007170e9e6e56d95cb0fd2eb33c1dbeca5d60b15a3"} Feb 24 10:13:33 crc kubenswrapper[4985]: I0224 10:13:33.239389 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:13:33 crc kubenswrapper[4985]: I0224 10:13:33.279563 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.193420 4985 generic.go:334] "Generic (PLEG): container finished" podID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerID="ff64fe42248851cbaec0afa0bd914ca330a30adb7ab6f8ce246034f3cefc7931" exitCode=0 Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.194721 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2ln" event={"ID":"49f9b37d-f90b-49ba-bb8d-bb34255c63c0","Type":"ContainerDied","Data":"ff64fe42248851cbaec0afa0bd914ca330a30adb7ab6f8ce246034f3cefc7931"} Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.222808 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.386174 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-catalog-content\") pod \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.386515 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-utilities\") pod \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.386553 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rqx4\" (UniqueName: \"kubernetes.io/projected/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-kube-api-access-4rqx4\") pod \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\" (UID: \"720477c9-8e44-43cf-a9ba-5ac5b96fe65b\") " Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.387216 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-utilities" (OuterVolumeSpecName: "utilities") pod "720477c9-8e44-43cf-a9ba-5ac5b96fe65b" (UID: "720477c9-8e44-43cf-a9ba-5ac5b96fe65b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.397156 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-kube-api-access-4rqx4" (OuterVolumeSpecName: "kube-api-access-4rqx4") pod "720477c9-8e44-43cf-a9ba-5ac5b96fe65b" (UID: "720477c9-8e44-43cf-a9ba-5ac5b96fe65b"). InnerVolumeSpecName "kube-api-access-4rqx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.444877 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "720477c9-8e44-43cf-a9ba-5ac5b96fe65b" (UID: "720477c9-8e44-43cf-a9ba-5ac5b96fe65b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.488283 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.488320 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.488330 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rqx4\" (UniqueName: \"kubernetes.io/projected/720477c9-8e44-43cf-a9ba-5ac5b96fe65b-kube-api-access-4rqx4\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.585083 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.690600 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9svt"] Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.690827 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f9svt" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="registry-server" containerID="cri-o://64c9a92fbeb5cc0bcff69d7245731bd930391eb44ccd78f3992cbe34cf3adac5" gracePeriod=2 Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.690868 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-catalog-content\") pod \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.690924 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49tf4\" (UniqueName: \"kubernetes.io/projected/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-kube-api-access-49tf4\") pod \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.690996 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-utilities\") pod \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\" (UID: \"49f9b37d-f90b-49ba-bb8d-bb34255c63c0\") " Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.691730 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-utilities" (OuterVolumeSpecName: "utilities") pod "49f9b37d-f90b-49ba-bb8d-bb34255c63c0" (UID: 
"49f9b37d-f90b-49ba-bb8d-bb34255c63c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.695134 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-kube-api-access-49tf4" (OuterVolumeSpecName: "kube-api-access-49tf4") pod "49f9b37d-f90b-49ba-bb8d-bb34255c63c0" (UID: "49f9b37d-f90b-49ba-bb8d-bb34255c63c0"). InnerVolumeSpecName "kube-api-access-49tf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.792708 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49tf4\" (UniqueName: \"kubernetes.io/projected/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-kube-api-access-49tf4\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.792750 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.836786 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49f9b37d-f90b-49ba-bb8d-bb34255c63c0" (UID: "49f9b37d-f90b-49ba-bb8d-bb34255c63c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:34 crc kubenswrapper[4985]: I0224 10:13:34.893615 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f9b37d-f90b-49ba-bb8d-bb34255c63c0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.200153 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msgzd" event={"ID":"720477c9-8e44-43cf-a9ba-5ac5b96fe65b","Type":"ContainerDied","Data":"a1ce1f4c5e55dfee91352b4d210f10ae503899dc99eeeb8950de305d48c8f541"} Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.200165 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-msgzd" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.200221 4985 scope.go:117] "RemoveContainer" containerID="9dea687a6ab33bc660735e007170e9e6e56d95cb0fd2eb33c1dbeca5d60b15a3" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.202440 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2ln" event={"ID":"49f9b37d-f90b-49ba-bb8d-bb34255c63c0","Type":"ContainerDied","Data":"2d4b218c86ae22c435eab27cb0afca6521b0fd7b35d2817159e919ecf4296500"} Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.202521 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8k2ln" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.223556 4985 scope.go:117] "RemoveContainer" containerID="a6f8be874c61afaba5b803b64dde25fcd4a99ff14ea0f1bed7a889c4a087a73b" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.235449 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-msgzd"] Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.239721 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-msgzd"] Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.247924 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5699f48c5-58p8c"] Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.248626 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" podUID="13afe978-9654-487a-9742-cf5e2e5a1e00" containerName="controller-manager" containerID="cri-o://7ba65d5fbdc9a5cf9412a25dd496582541cfdc139453e69bfd633a1079e5c4e4" gracePeriod=30 Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.249136 4985 scope.go:117] "RemoveContainer" containerID="e981e007a7ea0fce0be9468e1f9e9a588f85c1cfb6a3e4c227ef26fb6c6e1be8" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.255709 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8k2ln"] Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.263933 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8k2ln"] Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.283589 4985 scope.go:117] "RemoveContainer" containerID="ff64fe42248851cbaec0afa0bd914ca330a30adb7ab6f8ce246034f3cefc7931" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.298858 4985 scope.go:117] "RemoveContainer" 
containerID="5e5a4e5b12835c383b62f45f3756cc9dd16cbb0eff1ea1b5d5155c71c651e122" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.314104 4985 scope.go:117] "RemoveContainer" containerID="08ca263dc71769f8f24ce540adc52274a5a7dc92e3be740cd8fd1080cab15145" Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.346705 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7"] Feb 24 10:13:35 crc kubenswrapper[4985]: I0224 10:13:35.346979 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" containerName="route-controller-manager" containerID="cri-o://2bf4f79eb78d609e138c285678e987df8b0e6c91a4326947ca68586510799cd1" gracePeriod=30 Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.214400 4985 generic.go:334] "Generic (PLEG): container finished" podID="13afe978-9654-487a-9742-cf5e2e5a1e00" containerID="7ba65d5fbdc9a5cf9412a25dd496582541cfdc139453e69bfd633a1079e5c4e4" exitCode=0 Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.214612 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" event={"ID":"13afe978-9654-487a-9742-cf5e2e5a1e00","Type":"ContainerDied","Data":"7ba65d5fbdc9a5cf9412a25dd496582541cfdc139453e69bfd633a1079e5c4e4"} Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.223263 4985 generic.go:334] "Generic (PLEG): container finished" podID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerID="64c9a92fbeb5cc0bcff69d7245731bd930391eb44ccd78f3992cbe34cf3adac5" exitCode=0 Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.223394 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9svt" 
event={"ID":"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b","Type":"ContainerDied","Data":"64c9a92fbeb5cc0bcff69d7245731bd930391eb44ccd78f3992cbe34cf3adac5"} Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.274461 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" path="/var/lib/kubelet/pods/49f9b37d-f90b-49ba-bb8d-bb34255c63c0/volumes" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.275742 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" path="/var/lib/kubelet/pods/720477c9-8e44-43cf-a9ba-5ac5b96fe65b/volumes" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.677863 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.720051 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-catalog-content\") pod \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.720140 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-utilities\") pod \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.720225 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qpdq\" (UniqueName: \"kubernetes.io/projected/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-kube-api-access-7qpdq\") pod \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\" (UID: \"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b\") " Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.721184 4985 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-utilities" (OuterVolumeSpecName: "utilities") pod "cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" (UID: "cd5749ab-5662-41d6-9d8b-663bdd9b7a0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.724834 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-kube-api-access-7qpdq" (OuterVolumeSpecName: "kube-api-access-7qpdq") pod "cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" (UID: "cd5749ab-5662-41d6-9d8b-663bdd9b7a0b"). InnerVolumeSpecName "kube-api-access-7qpdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.744793 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" (UID: "cd5749ab-5662-41d6-9d8b-663bdd9b7a0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.822072 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qpdq\" (UniqueName: \"kubernetes.io/projected/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-kube-api-access-7qpdq\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.822124 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:36 crc kubenswrapper[4985]: I0224 10:13:36.822138 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.093323 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2cq6"] Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.093548 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v2cq6" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="registry-server" containerID="cri-o://24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf" gracePeriod=2 Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.234508 4985 generic.go:334] "Generic (PLEG): container finished" podID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" containerID="2bf4f79eb78d609e138c285678e987df8b0e6c91a4326947ca68586510799cd1" exitCode=0 Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.234703 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" 
event={"ID":"7dfb5a3d-1117-42e6-b4e3-59afe7960139","Type":"ContainerDied","Data":"2bf4f79eb78d609e138c285678e987df8b0e6c91a4326947ca68586510799cd1"} Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.237293 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f9svt" event={"ID":"cd5749ab-5662-41d6-9d8b-663bdd9b7a0b","Type":"ContainerDied","Data":"29db3a9b7b1869e00ae0d87b80d8ad197bbbdc4ede4cb6af006d8ebb1e569eb1"} Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.237352 4985 scope.go:117] "RemoveContainer" containerID="64c9a92fbeb5cc0bcff69d7245731bd930391eb44ccd78f3992cbe34cf3adac5" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.237441 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f9svt" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.252704 4985 scope.go:117] "RemoveContainer" containerID="ace5069341fc000f51a042d88790d0e024e879e745f830063cb3aeb897669cdf" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.270325 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9svt"] Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.272750 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f9svt"] Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.283419 4985 scope.go:117] "RemoveContainer" containerID="08043856d6e575b46ad8acdc69e420559f84d0f9606f18f6c8c1a6ba27ec67f4" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.424956 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.428157 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spssq\" (UniqueName: \"kubernetes.io/projected/7dfb5a3d-1117-42e6-b4e3-59afe7960139-kube-api-access-spssq\") pod \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.428242 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-config\") pod \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.428272 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-client-ca\") pod \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.428302 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfb5a3d-1117-42e6-b4e3-59afe7960139-serving-cert\") pod \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\" (UID: \"7dfb5a3d-1117-42e6-b4e3-59afe7960139\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.429286 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-client-ca" (OuterVolumeSpecName: "client-ca") pod "7dfb5a3d-1117-42e6-b4e3-59afe7960139" (UID: "7dfb5a3d-1117-42e6-b4e3-59afe7960139"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.429301 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-config" (OuterVolumeSpecName: "config") pod "7dfb5a3d-1117-42e6-b4e3-59afe7960139" (UID: "7dfb5a3d-1117-42e6-b4e3-59afe7960139"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.432990 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfb5a3d-1117-42e6-b4e3-59afe7960139-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7dfb5a3d-1117-42e6-b4e3-59afe7960139" (UID: "7dfb5a3d-1117-42e6-b4e3-59afe7960139"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.433194 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfb5a3d-1117-42e6-b4e3-59afe7960139-kube-api-access-spssq" (OuterVolumeSpecName: "kube-api-access-spssq") pod "7dfb5a3d-1117-42e6-b4e3-59afe7960139" (UID: "7dfb5a3d-1117-42e6-b4e3-59afe7960139"). InnerVolumeSpecName "kube-api-access-spssq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.449123 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.529008 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spssq\" (UniqueName: \"kubernetes.io/projected/7dfb5a3d-1117-42e6-b4e3-59afe7960139-kube-api-access-spssq\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.529054 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.529067 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfb5a3d-1117-42e6-b4e3-59afe7960139-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.529079 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfb5a3d-1117-42e6-b4e3-59afe7960139-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.629824 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13afe978-9654-487a-9742-cf5e2e5a1e00-serving-cert\") pod \"13afe978-9654-487a-9742-cf5e2e5a1e00\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.629869 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-client-ca\") pod \"13afe978-9654-487a-9742-cf5e2e5a1e00\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.629921 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lt2h6\" (UniqueName: \"kubernetes.io/projected/13afe978-9654-487a-9742-cf5e2e5a1e00-kube-api-access-lt2h6\") pod \"13afe978-9654-487a-9742-cf5e2e5a1e00\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.629949 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-config\") pod \"13afe978-9654-487a-9742-cf5e2e5a1e00\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.629979 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-proxy-ca-bundles\") pod \"13afe978-9654-487a-9742-cf5e2e5a1e00\" (UID: \"13afe978-9654-487a-9742-cf5e2e5a1e00\") " Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.630646 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "13afe978-9654-487a-9742-cf5e2e5a1e00" (UID: "13afe978-9654-487a-9742-cf5e2e5a1e00"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.630737 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-client-ca" (OuterVolumeSpecName: "client-ca") pod "13afe978-9654-487a-9742-cf5e2e5a1e00" (UID: "13afe978-9654-487a-9742-cf5e2e5a1e00"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.630940 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-config" (OuterVolumeSpecName: "config") pod "13afe978-9654-487a-9742-cf5e2e5a1e00" (UID: "13afe978-9654-487a-9742-cf5e2e5a1e00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.633858 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13afe978-9654-487a-9742-cf5e2e5a1e00-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "13afe978-9654-487a-9742-cf5e2e5a1e00" (UID: "13afe978-9654-487a-9742-cf5e2e5a1e00"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.633865 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13afe978-9654-487a-9742-cf5e2e5a1e00-kube-api-access-lt2h6" (OuterVolumeSpecName: "kube-api-access-lt2h6") pod "13afe978-9654-487a-9742-cf5e2e5a1e00" (UID: "13afe978-9654-487a-9742-cf5e2e5a1e00"). InnerVolumeSpecName "kube-api-access-lt2h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.730953 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13afe978-9654-487a-9742-cf5e2e5a1e00-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.730998 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.731011 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2h6\" (UniqueName: \"kubernetes.io/projected/13afe978-9654-487a-9742-cf5e2e5a1e00-kube-api-access-lt2h6\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.731024 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:37 crc kubenswrapper[4985]: I0224 10:13:37.731036 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13afe978-9654-487a-9742-cf5e2e5a1e00-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.231830 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.244266 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" event={"ID":"13afe978-9654-487a-9742-cf5e2e5a1e00","Type":"ContainerDied","Data":"d2f640b045cc6c545d7b8b3fd4a6f3c635e46f26812729aaf0f35a58ce9ea4b5"} Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.244284 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5699f48c5-58p8c" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.244312 4985 scope.go:117] "RemoveContainer" containerID="7ba65d5fbdc9a5cf9412a25dd496582541cfdc139453e69bfd633a1079e5c4e4" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.253391 4985 generic.go:334] "Generic (PLEG): container finished" podID="a6f03b61-4740-4538-877d-40390729b5ef" containerID="24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf" exitCode=0 Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.253449 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2cq6" event={"ID":"a6f03b61-4740-4538-877d-40390729b5ef","Type":"ContainerDied","Data":"24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf"} Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.253468 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2cq6" event={"ID":"a6f03b61-4740-4538-877d-40390729b5ef","Type":"ContainerDied","Data":"94a0453cf9b4945e1e2166655d10c7aebba6d3d5505aa15dcccaff46e7379aeb"} Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.253519 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2cq6" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.255327 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" event={"ID":"7dfb5a3d-1117-42e6-b4e3-59afe7960139","Type":"ContainerDied","Data":"1e5c074e134dc1eedf9f657c44df64d6feaef9d1739cef880f58df0a1677be73"} Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.255383 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.263185 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" podUID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" containerName="oauth-openshift" containerID="cri-o://93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a" gracePeriod=15 Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.263292 4985 scope.go:117] "RemoveContainer" containerID="24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.279192 4985 scope.go:117] "RemoveContainer" containerID="e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.280338 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" path="/var/lib/kubelet/pods/cd5749ab-5662-41d6-9d8b-663bdd9b7a0b/volumes" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.310692 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5699f48c5-58p8c"] Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.313845 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-5699f48c5-58p8c"] Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.318359 4985 scope.go:117] "RemoveContainer" containerID="799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.333426 4985 scope.go:117] "RemoveContainer" containerID="24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf" Feb 24 10:13:38 crc kubenswrapper[4985]: E0224 10:13:38.333858 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf\": container with ID starting with 24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf not found: ID does not exist" containerID="24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.333925 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf"} err="failed to get container status \"24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf\": rpc error: code = NotFound desc = could not find container \"24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf\": container with ID starting with 24d5fb7c3610b477753a3660bb6aad57e44fb5f8d886fccf378049f070945acf not found: ID does not exist" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.333957 4985 scope.go:117] "RemoveContainer" containerID="e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00" Feb 24 10:13:38 crc kubenswrapper[4985]: E0224 10:13:38.334286 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00\": container with ID starting with 
e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00 not found: ID does not exist" containerID="e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.334329 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00"} err="failed to get container status \"e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00\": rpc error: code = NotFound desc = could not find container \"e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00\": container with ID starting with e2d832c72145d54e22bfedf1645763669ff3bd6168af02fa6e2754aff107dc00 not found: ID does not exist" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.334361 4985 scope.go:117] "RemoveContainer" containerID="799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86" Feb 24 10:13:38 crc kubenswrapper[4985]: E0224 10:13:38.334588 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86\": container with ID starting with 799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86 not found: ID does not exist" containerID="799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.334610 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86"} err="failed to get container status \"799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86\": rpc error: code = NotFound desc = could not find container \"799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86\": container with ID starting with 799b1a36fdc0a1c36335330dddcbc49da92cc70753f7ca5090cb4cd4270aaf86 not found: ID does not 
exist" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.334622 4985 scope.go:117] "RemoveContainer" containerID="2bf4f79eb78d609e138c285678e987df8b0e6c91a4326947ca68586510799cd1" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.336702 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wql8d\" (UniqueName: \"kubernetes.io/projected/a6f03b61-4740-4538-877d-40390729b5ef-kube-api-access-wql8d\") pod \"a6f03b61-4740-4538-877d-40390729b5ef\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.336764 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-utilities\") pod \"a6f03b61-4740-4538-877d-40390729b5ef\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.337240 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-catalog-content\") pod \"a6f03b61-4740-4538-877d-40390729b5ef\" (UID: \"a6f03b61-4740-4538-877d-40390729b5ef\") " Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.337831 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-utilities" (OuterVolumeSpecName: "utilities") pod "a6f03b61-4740-4538-877d-40390729b5ef" (UID: "a6f03b61-4740-4538-877d-40390729b5ef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.338270 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.341234 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f03b61-4740-4538-877d-40390729b5ef-kube-api-access-wql8d" (OuterVolumeSpecName: "kube-api-access-wql8d") pod "a6f03b61-4740-4538-877d-40390729b5ef" (UID: "a6f03b61-4740-4538-877d-40390729b5ef"). InnerVolumeSpecName "kube-api-access-wql8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.438840 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wql8d\" (UniqueName: \"kubernetes.io/projected/a6f03b61-4740-4538-877d-40390729b5ef-kube-api-access-wql8d\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.597716 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6f03b61-4740-4538-877d-40390729b5ef" (UID: "a6f03b61-4740-4538-877d-40390729b5ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.641021 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6f03b61-4740-4538-877d-40390729b5ef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.887980 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2cq6"] Feb 24 10:13:38 crc kubenswrapper[4985]: I0224 10:13:38.898177 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v2cq6"] Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.024778 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b99b87784-mpfxr"] Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025655 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13afe978-9654-487a-9742-cf5e2e5a1e00" containerName="controller-manager" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025674 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="13afe978-9654-487a-9742-cf5e2e5a1e00" containerName="controller-manager" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025687 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025695 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025707 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025718 4985 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025731 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" containerName="route-controller-manager" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025739 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" containerName="route-controller-manager" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025748 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025756 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025766 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025771 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025778 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025785 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025792 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025799 4985 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025806 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025830 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025837 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025844 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025853 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025858 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="extract-utilities" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025865 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025870 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="extract-content" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025878 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646342de-73ea-4a23-b414-66b824660b5e" containerName="pruner" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025884 4985 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="646342de-73ea-4a23-b414-66b824660b5e" containerName="pruner" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025906 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025914 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.025926 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.025932 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026018 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5749ab-5662-41d6-9d8b-663bdd9b7a0b" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026029 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f9b37d-f90b-49ba-bb8d-bb34255c63c0" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026035 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="646342de-73ea-4a23-b414-66b824660b5e" containerName="pruner" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026044 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="13afe978-9654-487a-9742-cf5e2e5a1e00" containerName="controller-manager" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026054 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f03b61-4740-4538-877d-40390729b5ef" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026062 4985 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" containerName="route-controller-manager" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.026070 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="720477c9-8e44-43cf-a9ba-5ac5b96fe65b" containerName="registry-server" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.027009 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.029784 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.030008 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.030051 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.030521 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.030543 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.031792 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.034941 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h"] Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.036329 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.037944 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b99b87784-mpfxr"] Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.044568 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.046851 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmqc6\" (UniqueName: \"kubernetes.io/projected/b4ac1741-e660-4bb7-9685-45cde423045b-kube-api-access-hmqc6\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.046921 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ac1741-e660-4bb7-9685-45cde423045b-serving-cert\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.046966 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-client-ca\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.047014 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-config\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.047043 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-config\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.047065 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-proxy-ca-bundles\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.047100 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlhmv\" (UniqueName: \"kubernetes.io/projected/60d0c604-71d7-4807-bb1b-c9d19e3e507d-kube-api-access-vlhmv\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.047125 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-client-ca\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " 
pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.047158 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60d0c604-71d7-4807-bb1b-c9d19e3e507d-serving-cert\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.048529 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.048679 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.049134 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.050334 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.050479 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.050713 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.053090 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h"] Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.115653 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.148954 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-config\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149003 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-config\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149070 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-proxy-ca-bundles\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149132 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlhmv\" (UniqueName: \"kubernetes.io/projected/60d0c604-71d7-4807-bb1b-c9d19e3e507d-kube-api-access-vlhmv\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149158 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-client-ca\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149179 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60d0c604-71d7-4807-bb1b-c9d19e3e507d-serving-cert\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149201 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmqc6\" (UniqueName: \"kubernetes.io/projected/b4ac1741-e660-4bb7-9685-45cde423045b-kube-api-access-hmqc6\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149238 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ac1741-e660-4bb7-9685-45cde423045b-serving-cert\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.149274 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-client-ca\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc 
kubenswrapper[4985]: I0224 10:13:39.150000 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-client-ca\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.150165 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-client-ca\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.150243 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-proxy-ca-bundles\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.150923 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-config\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.152141 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-config\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " 
pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.153353 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60d0c604-71d7-4807-bb1b-c9d19e3e507d-serving-cert\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.153375 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ac1741-e660-4bb7-9685-45cde423045b-serving-cert\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.169149 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmqc6\" (UniqueName: \"kubernetes.io/projected/b4ac1741-e660-4bb7-9685-45cde423045b-kube-api-access-hmqc6\") pod \"route-controller-manager-f847b5bdd-bzh7h\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.171794 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlhmv\" (UniqueName: \"kubernetes.io/projected/60d0c604-71d7-4807-bb1b-c9d19e3e507d-kube-api-access-vlhmv\") pod \"controller-manager-b99b87784-mpfxr\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.250806 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-idp-0-file-data\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.250872 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qss5w\" (UniqueName: \"kubernetes.io/projected/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-kube-api-access-qss5w\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.250929 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-serving-cert\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.250968 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-session\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251005 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-policies\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251048 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-dir\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: 
\"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251097 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-service-ca\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251128 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-login\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251167 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-ocp-branding-template\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251255 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-trusted-ca-bundle\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251285 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-cliconfig\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 
24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251336 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-error\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251362 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-router-certs\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.251399 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-provider-selection\") pod \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\" (UID: \"ee57ec8e-3901-4355-b744-5ed2eeb20c9d\") " Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.252093 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.252630 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.252759 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.253444 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.253555 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.254266 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.254521 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-kube-api-access-qss5w" (OuterVolumeSpecName: "kube-api-access-qss5w") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "kube-api-access-qss5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.254569 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.255006 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.255858 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.256227 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.256523 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.256743 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.257817 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ee57ec8e-3901-4355-b744-5ed2eeb20c9d" (UID: "ee57ec8e-3901-4355-b744-5ed2eeb20c9d"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.273602 4985 generic.go:334] "Generic (PLEG): container finished" podID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" containerID="93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a" exitCode=0 Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.273667 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.273669 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" event={"ID":"ee57ec8e-3901-4355-b744-5ed2eeb20c9d","Type":"ContainerDied","Data":"93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a"} Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.273824 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xh5q8" event={"ID":"ee57ec8e-3901-4355-b744-5ed2eeb20c9d","Type":"ContainerDied","Data":"e5ee134f2567dda87c8478f23813f6e237d4049cb0519e1d99dd910b58b687bf"} Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.273867 4985 scope.go:117] "RemoveContainer" containerID="93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.311704 4985 scope.go:117] "RemoveContainer" containerID="93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a" Feb 24 10:13:39 crc kubenswrapper[4985]: E0224 10:13:39.312314 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a\": container with ID starting with 93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a not found: ID does not exist" containerID="93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a" 
Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.312437 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a"} err="failed to get container status \"93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a\": rpc error: code = NotFound desc = could not find container \"93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a\": container with ID starting with 93977d6aab2f8e5fb39fed069172045bef4276f3311667c0c21484be2d8e881a not found: ID does not exist" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.319172 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xh5q8"] Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.326080 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xh5q8"] Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.349747 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352764 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352787 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qss5w\" (UniqueName: \"kubernetes.io/projected/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-kube-api-access-qss5w\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352798 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352808 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352819 4985 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352826 4985 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352837 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352848 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352858 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352868 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352877 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352914 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.352926 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 
10:13:39.352936 4985 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee57ec8e-3901-4355-b744-5ed2eeb20c9d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.364735 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.529357 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b99b87784-mpfxr"] Feb 24 10:13:39 crc kubenswrapper[4985]: W0224 10:13:39.552061 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d0c604_71d7_4807_bb1b_c9d19e3e507d.slice/crio-c9cd4685b2561d4bfab8e556421818f31e0ec4ea574f818a015848d9a5d3c51c WatchSource:0}: Error finding container c9cd4685b2561d4bfab8e556421818f31e0ec4ea574f818a015848d9a5d3c51c: Status 404 returned error can't find the container with id c9cd4685b2561d4bfab8e556421818f31e0ec4ea574f818a015848d9a5d3c51c Feb 24 10:13:39 crc kubenswrapper[4985]: I0224 10:13:39.775185 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h"] Feb 24 10:13:39 crc kubenswrapper[4985]: W0224 10:13:39.780489 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ac1741_e660_4bb7_9685_45cde423045b.slice/crio-a20712e2b47d6141332b8ec00cd68c9e645e665f4c7037c466b4082036dc0777 WatchSource:0}: Error finding container a20712e2b47d6141332b8ec00cd68c9e645e665f4c7037c466b4082036dc0777: Status 404 returned error can't find the container with id a20712e2b47d6141332b8ec00cd68c9e645e665f4c7037c466b4082036dc0777 Feb 24 10:13:40 crc 
kubenswrapper[4985]: I0224 10:13:40.269758 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13afe978-9654-487a-9742-cf5e2e5a1e00" path="/var/lib/kubelet/pods/13afe978-9654-487a-9742-cf5e2e5a1e00/volumes" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.270421 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f03b61-4740-4538-877d-40390729b5ef" path="/var/lib/kubelet/pods/a6f03b61-4740-4538-877d-40390729b5ef/volumes" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.271139 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" path="/var/lib/kubelet/pods/ee57ec8e-3901-4355-b744-5ed2eeb20c9d/volumes" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.283252 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" event={"ID":"60d0c604-71d7-4807-bb1b-c9d19e3e507d","Type":"ContainerStarted","Data":"02756bf4947fdd483d5f2b687f4181cba6bb117342081a30a7a16fbeb5fbabc1"} Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.283308 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" event={"ID":"60d0c604-71d7-4807-bb1b-c9d19e3e507d","Type":"ContainerStarted","Data":"c9cd4685b2561d4bfab8e556421818f31e0ec4ea574f818a015848d9a5d3c51c"} Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.283580 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.286939 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" event={"ID":"b4ac1741-e660-4bb7-9685-45cde423045b","Type":"ContainerStarted","Data":"122448aa64cf6196ff489a05c84b8a05a400071de4c9e24f1d6964159cd120ee"} Feb 24 10:13:40 crc 
kubenswrapper[4985]: I0224 10:13:40.286975 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" event={"ID":"b4ac1741-e660-4bb7-9685-45cde423045b","Type":"ContainerStarted","Data":"a20712e2b47d6141332b8ec00cd68c9e645e665f4c7037c466b4082036dc0777"} Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.287163 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.288485 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.303768 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" podStartSLOduration=5.303747067 podStartE2EDuration="5.303747067s" podCreationTimestamp="2026-02-24 10:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:13:40.301279456 +0000 UTC m=+304.775472026" watchObservedRunningTime="2026-02-24 10:13:40.303747067 +0000 UTC m=+304.777939627" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 10:13:40.346517 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" podStartSLOduration=5.34649307 podStartE2EDuration="5.34649307s" podCreationTimestamp="2026-02-24 10:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:13:40.34439253 +0000 UTC m=+304.818585110" watchObservedRunningTime="2026-02-24 10:13:40.34649307 +0000 UTC m=+304.820685640" Feb 24 10:13:40 crc kubenswrapper[4985]: I0224 
10:13:40.411975 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.029756 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7444bbcf85-bqpnk"] Feb 24 10:13:41 crc kubenswrapper[4985]: E0224 10:13:41.030225 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" containerName="oauth-openshift" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.030246 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" containerName="oauth-openshift" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.030508 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee57ec8e-3901-4355-b744-5ed2eeb20c9d" containerName="oauth-openshift" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.032796 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.037378 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.037599 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.038117 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.038264 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.039186 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.039506 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.039561 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.039655 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.047980 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.048388 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 24 
10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.048484 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.051360 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.055800 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7444bbcf85-bqpnk"] Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.057645 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.068872 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.070204 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177207 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b99473d-c293-4701-b7b3-81d6fa32c0d0-audit-dir\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177259 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " 
pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177296 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177418 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-error\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177474 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-router-certs\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177516 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-service-ca\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177544 4985 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177593 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5whm\" (UniqueName: \"kubernetes.io/projected/3b99473d-c293-4701-b7b3-81d6fa32c0d0-kube-api-access-c5whm\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177647 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.177683 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.178147 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-session\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.178250 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-audit-policies\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.178333 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-login\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.178420 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.279773 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b99473d-c293-4701-b7b3-81d6fa32c0d0-audit-dir\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " 
pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.279836 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.279904 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.279903 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b99473d-c293-4701-b7b3-81d6fa32c0d0-audit-dir\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.279987 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-error\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280017 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-router-certs\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280050 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-service-ca\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280071 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280118 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5whm\" (UniqueName: \"kubernetes.io/projected/3b99473d-c293-4701-b7b3-81d6fa32c0d0-kube-api-access-c5whm\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280185 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " 
pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280230 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280260 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-session\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280283 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-audit-policies\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280312 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-login\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.280334 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.281455 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.282194 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.282328 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-audit-policies\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.282530 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-service-ca\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc 
kubenswrapper[4985]: I0224 10:13:41.286010 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-session\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.287407 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-router-certs\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.288177 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.293317 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-error\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.293776 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.296365 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.297143 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.298208 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b99473d-c293-4701-b7b3-81d6fa32c0d0-v4-0-config-user-template-login\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.298513 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5whm\" (UniqueName: \"kubernetes.io/projected/3b99473d-c293-4701-b7b3-81d6fa32c0d0-kube-api-access-c5whm\") pod \"oauth-openshift-7444bbcf85-bqpnk\" (UID: \"3b99473d-c293-4701-b7b3-81d6fa32c0d0\") " pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 
10:13:41 crc kubenswrapper[4985]: I0224 10:13:41.370115 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:42 crc kubenswrapper[4985]: I0224 10:13:42.009978 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7444bbcf85-bqpnk"] Feb 24 10:13:42 crc kubenswrapper[4985]: W0224 10:13:42.019014 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b99473d_c293_4701_b7b3_81d6fa32c0d0.slice/crio-04cc21b75eb90ec08d88b2cda54224a9764d3dc16b2b39ff29af7717f2ed21bb WatchSource:0}: Error finding container 04cc21b75eb90ec08d88b2cda54224a9764d3dc16b2b39ff29af7717f2ed21bb: Status 404 returned error can't find the container with id 04cc21b75eb90ec08d88b2cda54224a9764d3dc16b2b39ff29af7717f2ed21bb Feb 24 10:13:42 crc kubenswrapper[4985]: I0224 10:13:42.307279 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" event={"ID":"3b99473d-c293-4701-b7b3-81d6fa32c0d0","Type":"ContainerStarted","Data":"11e4eb4c72ffaa584d79dcbdb9f765f48b5ff9a820a45f42518755e76f646306"} Feb 24 10:13:42 crc kubenswrapper[4985]: I0224 10:13:42.307605 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" event={"ID":"3b99473d-c293-4701-b7b3-81d6fa32c0d0","Type":"ContainerStarted","Data":"04cc21b75eb90ec08d88b2cda54224a9764d3dc16b2b39ff29af7717f2ed21bb"} Feb 24 10:13:42 crc kubenswrapper[4985]: I0224 10:13:42.337206 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" podStartSLOduration=29.337183754 podStartE2EDuration="29.337183754s" podCreationTimestamp="2026-02-24 10:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:13:42.335970649 +0000 UTC m=+306.810163209" watchObservedRunningTime="2026-02-24 10:13:42.337183754 +0000 UTC m=+306.811376314" Feb 24 10:13:43 crc kubenswrapper[4985]: I0224 10:13:43.313162 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:43 crc kubenswrapper[4985]: I0224 10:13:43.322580 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7444bbcf85-bqpnk" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.800466 4985 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.801282 4985 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.801415 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.801600 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57" gracePeriod=15 Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.801678 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c" gracePeriod=15 Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.801719 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c" gracePeriod=15 Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.801736 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb" gracePeriod=15 Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.802375 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158" gracePeriod=15 Feb 24 10:13:44 crc 
kubenswrapper[4985]: I0224 10:13:44.803544 4985 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803735 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803751 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803759 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803765 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803773 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803780 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803786 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803792 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803802 4985 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803807 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803815 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803820 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803828 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803833 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803839 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803845 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.803854 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803859 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803968 4985 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803977 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803984 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.803993 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.804023 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.804033 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.804039 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 10:13:44 crc kubenswrapper[4985]: E0224 10:13:44.804127 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.804134 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.804216 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.804223 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.837707 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870372 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870425 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870461 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870605 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870745 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870815 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870897 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.870932 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972316 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972358 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972392 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972417 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972424 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972450 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972467 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972483 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972491 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972507 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972501 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc 
kubenswrapper[4985]: I0224 10:13:44.972535 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972504 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972511 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972500 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:44 crc kubenswrapper[4985]: I0224 10:13:44.972555 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.140153 4985 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:13:45 crc kubenswrapper[4985]: W0224 10:13:45.166824 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5e06b2747862eb9172d26b5a0c5c84d2c22e16cea29000bd1fd65c8e03a3cd52 WatchSource:0}: Error finding container 5e06b2747862eb9172d26b5a0c5c84d2c22e16cea29000bd1fd65c8e03a3cd52: Status 404 returned error can't find the container with id 5e06b2747862eb9172d26b5a0c5c84d2c22e16cea29000bd1fd65c8e03a3cd52 Feb 24 10:13:45 crc kubenswrapper[4985]: E0224 10:13:45.169670 4985 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189727269bc1a13b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:13:45.168978235 +0000 UTC m=+309.643170795,LastTimestamp:2026-02-24 10:13:45.168978235 +0000 UTC m=+309.643170795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.325759 4985 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.327026 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.328690 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c" exitCode=0 Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.328713 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158" exitCode=0 Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.328721 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c" exitCode=0 Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.328729 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb" exitCode=2 Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.328779 4985 scope.go:117] "RemoveContainer" containerID="6ee8b871ab25b79ee67f4e5735673469d19cfb336696170957d2439b0bca13af" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.331138 4985 generic.go:334] "Generic (PLEG): container finished" podID="6e3d6d7b-c88d-4966-8a11-915441e6b482" containerID="2ff32f122e7404f1f9a41c457c304d0ef3740c19f6445a4412d1b481681df96d" exitCode=0 Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.331264 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"6e3d6d7b-c88d-4966-8a11-915441e6b482","Type":"ContainerDied","Data":"2ff32f122e7404f1f9a41c457c304d0ef3740c19f6445a4412d1b481681df96d"} Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.331690 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.331848 4985 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.332027 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.333460 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5e06b2747862eb9172d26b5a0c5c84d2c22e16cea29000bd1fd65c8e03a3cd52"} Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.369535 4985 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" 
start-of-body= Feb 24 10:13:45 crc kubenswrapper[4985]: I0224 10:13:45.369605 4985 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.266171 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.266633 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.267061 4985 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.342616 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.344879 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f"} Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.346023 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.346347 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.733834 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.734844 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.735252 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.793859 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-var-lock\") pod \"6e3d6d7b-c88d-4966-8a11-915441e6b482\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.793979 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e3d6d7b-c88d-4966-8a11-915441e6b482-kube-api-access\") pod \"6e3d6d7b-c88d-4966-8a11-915441e6b482\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.794035 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-kubelet-dir\") pod \"6e3d6d7b-c88d-4966-8a11-915441e6b482\" (UID: \"6e3d6d7b-c88d-4966-8a11-915441e6b482\") " Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.794271 4985 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-var-lock" (OuterVolumeSpecName: "var-lock") pod "6e3d6d7b-c88d-4966-8a11-915441e6b482" (UID: "6e3d6d7b-c88d-4966-8a11-915441e6b482"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.794330 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6e3d6d7b-c88d-4966-8a11-915441e6b482" (UID: "6e3d6d7b-c88d-4966-8a11-915441e6b482"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.794440 4985 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.794485 4985 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e3d6d7b-c88d-4966-8a11-915441e6b482-var-lock\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.804711 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3d6d7b-c88d-4966-8a11-915441e6b482-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6e3d6d7b-c88d-4966-8a11-915441e6b482" (UID: "6e3d6d7b-c88d-4966-8a11-915441e6b482"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:13:46 crc kubenswrapper[4985]: I0224 10:13:46.896774 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e3d6d7b-c88d-4966-8a11-915441e6b482-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.158426 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.159554 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.160146 4985 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.160448 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.160868 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 
10:13:47.200286 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200503 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200539 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200605 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200725 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200845 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200977 4985 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.200994 4985 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.303045 4985 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.352654 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.353544 4985 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57" exitCode=0 Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.353629 4985 scope.go:117] "RemoveContainer" containerID="dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.353962 4985 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.355484 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6e3d6d7b-c88d-4966-8a11-915441e6b482","Type":"ContainerDied","Data":"5d9ef0afd9f506a72c0e916f93b3d6a6f3d6cad0dc71a133f13bc28b52c9a22c"} Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.355525 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.355537 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9ef0afd9f506a72c0e916f93b3d6a6f3d6cad0dc71a133f13bc28b52c9a22c" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.388737 4985 scope.go:117] "RemoveContainer" containerID="5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.389829 4985 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.390727 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.391791 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.392545 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.393260 4985 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.393857 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.407208 4985 scope.go:117] "RemoveContainer" containerID="ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.436422 4985 scope.go:117] "RemoveContainer" containerID="7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.456566 4985 scope.go:117] "RemoveContainer" containerID="8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 
10:13:47.477050 4985 scope.go:117] "RemoveContainer" containerID="8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.510972 4985 scope.go:117] "RemoveContainer" containerID="dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c" Feb 24 10:13:47 crc kubenswrapper[4985]: E0224 10:13:47.511409 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\": container with ID starting with dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c not found: ID does not exist" containerID="dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.511442 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c"} err="failed to get container status \"dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\": rpc error: code = NotFound desc = could not find container \"dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c\": container with ID starting with dea2225b9a0112117f2f0bd6038ef636ac20e0cfcf3d608b720ad7aa2cacfa7c not found: ID does not exist" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.511462 4985 scope.go:117] "RemoveContainer" containerID="5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158" Feb 24 10:13:47 crc kubenswrapper[4985]: E0224 10:13:47.511788 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\": container with ID starting with 5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158 not found: ID does not exist" 
containerID="5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.511809 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158"} err="failed to get container status \"5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\": rpc error: code = NotFound desc = could not find container \"5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158\": container with ID starting with 5cf6a4886c046added0a4380f1ca023def94c15e9d5c0ed472c714632684f158 not found: ID does not exist" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.511821 4985 scope.go:117] "RemoveContainer" containerID="ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c" Feb 24 10:13:47 crc kubenswrapper[4985]: E0224 10:13:47.512122 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\": container with ID starting with ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c not found: ID does not exist" containerID="ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.512197 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c"} err="failed to get container status \"ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\": rpc error: code = NotFound desc = could not find container \"ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c\": container with ID starting with ba53e3e1dfd92b0155e60254f21e53e06da9929560934d7bd95624f4a2bb619c not found: ID does not exist" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.512240 4985 scope.go:117] 
"RemoveContainer" containerID="7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb" Feb 24 10:13:47 crc kubenswrapper[4985]: E0224 10:13:47.512601 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\": container with ID starting with 7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb not found: ID does not exist" containerID="7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.512630 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb"} err="failed to get container status \"7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\": rpc error: code = NotFound desc = could not find container \"7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb\": container with ID starting with 7d059ea80117284da8c7eb82e4cd878eabba8f9d96102965708b9f6e0f8e96eb not found: ID does not exist" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.512646 4985 scope.go:117] "RemoveContainer" containerID="8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57" Feb 24 10:13:47 crc kubenswrapper[4985]: E0224 10:13:47.513001 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\": container with ID starting with 8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57 not found: ID does not exist" containerID="8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.513024 4985 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57"} err="failed to get container status \"8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\": rpc error: code = NotFound desc = could not find container \"8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57\": container with ID starting with 8f9134e7cde3d6c701f928934ed4de91ee13bac8bc62dde5df6fe535a219fb57 not found: ID does not exist" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.513036 4985 scope.go:117] "RemoveContainer" containerID="8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f" Feb 24 10:13:47 crc kubenswrapper[4985]: E0224 10:13:47.513419 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\": container with ID starting with 8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f not found: ID does not exist" containerID="8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f" Feb 24 10:13:47 crc kubenswrapper[4985]: I0224 10:13:47.513441 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f"} err="failed to get container status \"8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\": rpc error: code = NotFound desc = could not find container \"8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f\": container with ID starting with 8da8e0d5478ae714a89c61cf4ca4142bc433bca4d4e4749f069c706abfed8c8f not found: ID does not exist" Feb 24 10:13:48 crc kubenswrapper[4985]: I0224 10:13:48.273873 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 24 10:13:50 crc kubenswrapper[4985]: E0224 
10:13:50.087117 4985 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189727269bc1a13b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:13:45.168978235 +0000 UTC m=+309.643170795,LastTimestamp:2026-02-24 10:13:45.168978235 +0000 UTC m=+309.643170795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.252689 4985 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.253351 4985 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.253873 4985 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.181:6443: connect: connection refused" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.254398 4985 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.254841 4985 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:51 crc kubenswrapper[4985]: I0224 10:13:51.254921 4985 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.255315 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="200ms" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.456202 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="400ms" Feb 24 10:13:51 crc kubenswrapper[4985]: E0224 10:13:51.857546 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="800ms" Feb 24 10:13:52 crc kubenswrapper[4985]: E0224 10:13:52.658869 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="1.6s" Feb 24 10:13:54 crc kubenswrapper[4985]: E0224 10:13:54.260162 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="3.2s" Feb 24 10:13:56 crc kubenswrapper[4985]: I0224 10:13:56.268579 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:56 crc kubenswrapper[4985]: I0224 10:13:56.269655 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:56 crc kubenswrapper[4985]: E0224 10:13:56.338489 4985 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" volumeName="registry-storage" Feb 24 10:13:57 crc kubenswrapper[4985]: E0224 10:13:57.463100 4985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="6.4s" Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.428929 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.430136 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.430350 4985 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e" exitCode=1 Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.430443 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e"} Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.431372 4985 scope.go:117] "RemoveContainer" containerID="171f1effb015bd364e8afd406c496b6312f61c665b54b95b11fb5b203fca4c7e" Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.431747 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.432400 4985 status_manager.go:851] "Failed to get status for pod" 
podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:58 crc kubenswrapper[4985]: I0224 10:13:58.432963 4985 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.378339 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.445139 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.446299 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.446381 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b1de3c9189ba16e843d43548fff58d69df54e7f9791c9cb5a2d127af3bf6569"} Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.447676 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.448361 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:13:59 crc kubenswrapper[4985]: I0224 10:13:59.449048 4985 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:00 crc kubenswrapper[4985]: E0224 10:14:00.088777 4985 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189727269bc1a13b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:13:45.168978235 +0000 UTC 
m=+309.643170795,LastTimestamp:2026-02-24 10:13:45.168978235 +0000 UTC m=+309.643170795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.264176 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.266148 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.266624 4985 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.267299 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.292147 4985 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.292198 4985 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:00 crc kubenswrapper[4985]: E0224 10:14:00.292815 4985 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.293265 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:00 crc kubenswrapper[4985]: W0224 10:14:00.327931 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-700c8c175be3c41e87cc54564c1906501f6ad0c397baa97621dc429db6305e9c WatchSource:0}: Error finding container 700c8c175be3c41e87cc54564c1906501f6ad0c397baa97621dc429db6305e9c: Status 404 returned error can't find the container with id 700c8c175be3c41e87cc54564c1906501f6ad0c397baa97621dc429db6305e9c Feb 24 10:14:00 crc kubenswrapper[4985]: I0224 10:14:00.452264 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"700c8c175be3c41e87cc54564c1906501f6ad0c397baa97621dc429db6305e9c"} Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.460878 4985 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ce85cce1538894b3ee416d406ed58eb54cea69f9d06af0180ef775f2c8c32af6" exitCode=0 Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.460960 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ce85cce1538894b3ee416d406ed58eb54cea69f9d06af0180ef775f2c8c32af6"} Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.461374 4985 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.461391 4985 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:01 crc kubenswrapper[4985]: E0224 10:14:01.461758 4985 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.461761 4985 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.462204 4985 status_manager.go:851] "Failed to get status for pod" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:01 crc kubenswrapper[4985]: I0224 10:14:01.462649 4985 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Feb 24 10:14:02 crc kubenswrapper[4985]: I0224 10:14:02.474902 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1a8884ce147fcc9eb7733904dc04d99eceb7964fae8eb43172db3e03ae46e495"} Feb 24 10:14:02 crc kubenswrapper[4985]: I0224 10:14:02.475212 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"579a743de819285a14c714eb4721e2688f3cbf7b38eb526473b1aad6f5cc64fc"} Feb 24 10:14:02 crc kubenswrapper[4985]: I0224 10:14:02.475226 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dc543b49d68b56652fb7a389eb36c73f9ba68f6c709c8994c8812bdb5046575f"} Feb 24 10:14:02 crc kubenswrapper[4985]: I0224 10:14:02.475235 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3cec44e416616c31a7b3c1a63ec92b69b70b562ce9a10e06ca631534bcab1a90"} Feb 24 10:14:03 crc kubenswrapper[4985]: I0224 10:14:03.482054 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b5fa4e76393a74bcc2f5886811667b63a8bfcdb33af2981878f8a68a2d04dc18"} Feb 24 10:14:03 crc kubenswrapper[4985]: I0224 10:14:03.482306 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:03 crc kubenswrapper[4985]: I0224 
10:14:03.482354 4985 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:03 crc kubenswrapper[4985]: I0224 10:14:03.482380 4985 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:05 crc kubenswrapper[4985]: I0224 10:14:05.293797 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:05 crc kubenswrapper[4985]: I0224 10:14:05.294520 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:05 crc kubenswrapper[4985]: I0224 10:14:05.304561 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:05 crc kubenswrapper[4985]: I0224 10:14:05.483432 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:14:08 crc kubenswrapper[4985]: I0224 10:14:08.270281 4985 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod7dfb5a3d-1117-42e6-b4e3-59afe7960139"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod7dfb5a3d-1117-42e6-b4e3-59afe7960139] : Timed out while waiting for systemd to remove kubepods-burstable-pod7dfb5a3d_1117_42e6_b4e3_59afe7960139.slice" Feb 24 10:14:08 crc kubenswrapper[4985]: E0224 10:14:08.271982 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod7dfb5a3d-1117-42e6-b4e3-59afe7960139] : unable to destroy cgroup paths for cgroup [kubepods burstable pod7dfb5a3d-1117-42e6-b4e3-59afe7960139] : Timed out while waiting for systemd to remove kubepods-burstable-pod7dfb5a3d_1117_42e6_b4e3_59afe7960139.slice" 
pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" Feb 24 10:14:08 crc kubenswrapper[4985]: I0224 10:14:08.494459 4985 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:14:08 crc kubenswrapper[4985]: I0224 10:14:08.507487 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7" Feb 24 10:14:08 crc kubenswrapper[4985]: I0224 10:14:08.584842 4985 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3ef17351-37b4-49cd-bdd6-a781fc2c98aa" Feb 24 10:14:09 crc kubenswrapper[4985]: I0224 10:14:09.378929 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:14:09 crc kubenswrapper[4985]: I0224 10:14:09.384007 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:14:09 crc kubenswrapper[4985]: I0224 10:14:09.514551 4985 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:09 crc kubenswrapper[4985]: I0224 10:14:09.514608 4985 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a6395bdd-0d8e-4572-ae86-695e87aff12e" Feb 24 10:14:09 crc kubenswrapper[4985]: I0224 10:14:09.518710 4985 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3ef17351-37b4-49cd-bdd6-a781fc2c98aa" Feb 24 10:14:09 crc kubenswrapper[4985]: I0224 
10:14:09.519438 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:14:17 crc kubenswrapper[4985]: I0224 10:14:17.796317 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 24 10:14:18 crc kubenswrapper[4985]: I0224 10:14:18.008693 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 24 10:14:18 crc kubenswrapper[4985]: I0224 10:14:18.263035 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 24 10:14:18 crc kubenswrapper[4985]: I0224 10:14:18.696277 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 10:14:18 crc kubenswrapper[4985]: I0224 10:14:18.781501 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:14:18 crc kubenswrapper[4985]: I0224 10:14:18.894791 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 24 10:14:18 crc kubenswrapper[4985]: I0224 10:14:18.961795 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.050528 4985 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.166002 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.606496 4985 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.651774 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.750310 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.828330 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 10:14:19 crc kubenswrapper[4985]: I0224 10:14:19.962201 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.089567 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.143292 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.160960 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.252642 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.257626 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.332264 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 24 10:14:20 crc 
kubenswrapper[4985]: I0224 10:14:20.648937 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.726407 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.766491 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.877472 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.910950 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 24 10:14:20 crc kubenswrapper[4985]: I0224 10:14:20.911738 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.009591 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.150181 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.285975 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.289605 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.332087 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.338573 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.339720 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.522840 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.583019 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.739061 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.766508 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.790697 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.794647 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.813134 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.823040 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.866082 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.871670 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.920668 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.949107 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 24 10:14:21 crc kubenswrapper[4985]: I0224 10:14:21.985367 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.007181 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.015493 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.192972 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.231078 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.374253 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.377621 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.423463 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.449780 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.453552 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.503837 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.520993 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.531614 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.593469 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.612485 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.640505 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.649814 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.778088 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.826166 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.970985 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 24 10:14:22 crc kubenswrapper[4985]: I0224 10:14:22.997807 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.072901 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.301883 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.347054 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.392511 4985 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.463853 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.464854 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.572966 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.646529 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.670566 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.787596 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 24 10:14:23 crc kubenswrapper[4985]: I0224 10:14:23.888350 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.002953 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.064783 4985 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.084281 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.160447 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.202552 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.202645 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.288163 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.332381 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.486165 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.486347 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.491861 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.581625 4985 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.586016 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.585990877 podStartE2EDuration="40.585990877s" podCreationTimestamp="2026-02-24 10:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:14:08.511269053 +0000 UTC m=+332.985461613" watchObservedRunningTime="2026-02-24 10:14:24.585990877 +0000 UTC m=+349.060183477"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.589250 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f755ff7c-fvhz7","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.589329 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.593757 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.597549 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.620151 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.620122403 podStartE2EDuration="16.620122403s" podCreationTimestamp="2026-02-24 10:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:14:24.611397994 +0000 UTC m=+349.085590644" watchObservedRunningTime="2026-02-24 10:14:24.620122403 +0000 UTC m=+349.094315003"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.626875 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.733908 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.746088 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.775357 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.838323 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.870869 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.881162 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 24 10:14:24 crc kubenswrapper[4985]: I0224 10:14:24.983152 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.063204 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.083483 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.183310 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.255111 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.347090 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.400568 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.429188 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.446879 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.471444 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.539918 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.630083 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.670620 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.710238 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.726626 4985 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.816529 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.854039 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.893629 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 10:14:25 crc kubenswrapper[4985]: I0224 10:14:25.965074 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.123946 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.140336 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.153776 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.162085 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.246481 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.276618 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfb5a3d-1117-42e6-b4e3-59afe7960139" path="/var/lib/kubelet/pods/7dfb5a3d-1117-42e6-b4e3-59afe7960139/volumes"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.288488 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.300783 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.325644 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.355065 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.437530 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.451833 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.579508 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.586137 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.637662 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.681773 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.716587 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 24 10:14:26 crc kubenswrapper[4985]: I0224 10:14:26.830645 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.097813 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.170169 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.184724 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.190652 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.193962 4985 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.205962 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.216510 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.285766 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.305429 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.338293 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.353362 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.353485 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.366699 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.380998 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.439426 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.472675 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.533704 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.543934 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.550437 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.634772 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.640608 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.647978 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.722646 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.840418 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.880601 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 24 10:14:27 crc kubenswrapper[4985]: I0224 10:14:27.941142 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.020637 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.021444 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.033071 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.107415 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.120662 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.184289 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.193316 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.197571 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.235478 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.285048 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.388600 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.478138 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.679993 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.884926 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.909955 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.914495 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.935572 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 10:14:28 crc kubenswrapper[4985]: I0224 10:14:28.984667 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.011824 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.198640 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.287112 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.344493 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.433448 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.483466 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.607204 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.634711 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.951525 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.957756 4985 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 24 10:14:29 crc kubenswrapper[4985]: I0224 10:14:29.971805 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.014367 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.041921 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.062261 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.130852 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.331339 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.380497 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.382334 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.399773 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.504912 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.515812 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.520199 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.578086 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.622550 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.651908 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.738511 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.793250 4985 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.793619 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f" gracePeriod=5
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.855456 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.873761 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 24 10:14:30 crc kubenswrapper[4985]: I0224 10:14:30.885719 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.051676 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.102655 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.226660 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.248798 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.457748 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.496390 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.508938 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.653825 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.654532 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.666233 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.682706 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.694988 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 24 10:14:31 crc kubenswrapper[4985]: I0224 10:14:31.952409 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.026364 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.129455 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.170944 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.312531 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.385423 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.722739 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.730754 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.761333 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 24 10:14:32 crc kubenswrapper[4985]: I0224 10:14:32.857687 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 24 10:14:33 crc kubenswrapper[4985]: I0224 10:14:33.108825 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 24 10:14:33 crc kubenswrapper[4985]: I0224 10:14:33.261389
4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 24 10:14:33 crc kubenswrapper[4985]: I0224 10:14:33.317845 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 24 10:14:33 crc kubenswrapper[4985]: I0224 10:14:33.355866 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 24 10:14:33 crc kubenswrapper[4985]: I0224 10:14:33.978863 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 24 10:14:34 crc kubenswrapper[4985]: I0224 10:14:34.167768 4985 ???:1] "http: TLS handshake error from 192.168.126.11:57660: no serving certificate available for the kubelet" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.263860 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b99b87784-mpfxr"] Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.268440 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h"] Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.268877 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" podUID="b4ac1741-e660-4bb7-9685-45cde423045b" containerName="route-controller-manager" containerID="cri-o://122448aa64cf6196ff489a05c84b8a05a400071de4c9e24f1d6964159cd120ee" gracePeriod=30 Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.269169 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" podUID="60d0c604-71d7-4807-bb1b-c9d19e3e507d" containerName="controller-manager" 
containerID="cri-o://02756bf4947fdd483d5f2b687f4181cba6bb117342081a30a7a16fbeb5fbabc1" gracePeriod=30 Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.678477 4985 generic.go:334] "Generic (PLEG): container finished" podID="60d0c604-71d7-4807-bb1b-c9d19e3e507d" containerID="02756bf4947fdd483d5f2b687f4181cba6bb117342081a30a7a16fbeb5fbabc1" exitCode=0 Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.678574 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" event={"ID":"60d0c604-71d7-4807-bb1b-c9d19e3e507d","Type":"ContainerDied","Data":"02756bf4947fdd483d5f2b687f4181cba6bb117342081a30a7a16fbeb5fbabc1"} Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.681042 4985 generic.go:334] "Generic (PLEG): container finished" podID="b4ac1741-e660-4bb7-9685-45cde423045b" containerID="122448aa64cf6196ff489a05c84b8a05a400071de4c9e24f1d6964159cd120ee" exitCode=0 Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.681113 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" event={"ID":"b4ac1741-e660-4bb7-9685-45cde423045b","Type":"ContainerDied","Data":"122448aa64cf6196ff489a05c84b8a05a400071de4c9e24f1d6964159cd120ee"} Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.731029 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.770608 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.900257 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-config\") pod \"b4ac1741-e660-4bb7-9685-45cde423045b\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.900544 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ac1741-e660-4bb7-9685-45cde423045b-serving-cert\") pod \"b4ac1741-e660-4bb7-9685-45cde423045b\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.900669 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60d0c604-71d7-4807-bb1b-c9d19e3e507d-serving-cert\") pod \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.900821 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-client-ca\") pod \"b4ac1741-e660-4bb7-9685-45cde423045b\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.900973 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmqc6\" (UniqueName: \"kubernetes.io/projected/b4ac1741-e660-4bb7-9685-45cde423045b-kube-api-access-hmqc6\") pod \"b4ac1741-e660-4bb7-9685-45cde423045b\" (UID: \"b4ac1741-e660-4bb7-9685-45cde423045b\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901121 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-client-ca\") pod \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901218 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-proxy-ca-bundles\") pod \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901372 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-config\") pod \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901505 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlhmv\" (UniqueName: \"kubernetes.io/projected/60d0c604-71d7-4807-bb1b-c9d19e3e507d-kube-api-access-vlhmv\") pod \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\" (UID: \"60d0c604-71d7-4807-bb1b-c9d19e3e507d\") " Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901579 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-client-ca" (OuterVolumeSpecName: "client-ca") pod "b4ac1741-e660-4bb7-9685-45cde423045b" (UID: "b4ac1741-e660-4bb7-9685-45cde423045b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901818 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-client-ca" (OuterVolumeSpecName: "client-ca") pod "60d0c604-71d7-4807-bb1b-c9d19e3e507d" (UID: "60d0c604-71d7-4807-bb1b-c9d19e3e507d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.901863 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "60d0c604-71d7-4807-bb1b-c9d19e3e507d" (UID: "60d0c604-71d7-4807-bb1b-c9d19e3e507d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.902072 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-config" (OuterVolumeSpecName: "config") pod "b4ac1741-e660-4bb7-9685-45cde423045b" (UID: "b4ac1741-e660-4bb7-9685-45cde423045b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.902083 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.902236 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.902317 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.902243 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-config" (OuterVolumeSpecName: "config") pod "60d0c604-71d7-4807-bb1b-c9d19e3e507d" (UID: "60d0c604-71d7-4807-bb1b-c9d19e3e507d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.905693 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ac1741-e660-4bb7-9685-45cde423045b-kube-api-access-hmqc6" (OuterVolumeSpecName: "kube-api-access-hmqc6") pod "b4ac1741-e660-4bb7-9685-45cde423045b" (UID: "b4ac1741-e660-4bb7-9685-45cde423045b"). InnerVolumeSpecName "kube-api-access-hmqc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.905785 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ac1741-e660-4bb7-9685-45cde423045b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b4ac1741-e660-4bb7-9685-45cde423045b" (UID: "b4ac1741-e660-4bb7-9685-45cde423045b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.905817 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d0c604-71d7-4807-bb1b-c9d19e3e507d-kube-api-access-vlhmv" (OuterVolumeSpecName: "kube-api-access-vlhmv") pod "60d0c604-71d7-4807-bb1b-c9d19e3e507d" (UID: "60d0c604-71d7-4807-bb1b-c9d19e3e507d"). InnerVolumeSpecName "kube-api-access-vlhmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:14:35 crc kubenswrapper[4985]: I0224 10:14:35.906033 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d0c604-71d7-4807-bb1b-c9d19e3e507d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60d0c604-71d7-4807-bb1b-c9d19e3e507d" (UID: "60d0c604-71d7-4807-bb1b-c9d19e3e507d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.003702 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60d0c604-71d7-4807-bb1b-c9d19e3e507d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.003765 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmqc6\" (UniqueName: \"kubernetes.io/projected/b4ac1741-e660-4bb7-9685-45cde423045b-kube-api-access-hmqc6\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.003809 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60d0c604-71d7-4807-bb1b-c9d19e3e507d-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.003837 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlhmv\" (UniqueName: \"kubernetes.io/projected/60d0c604-71d7-4807-bb1b-c9d19e3e507d-kube-api-access-vlhmv\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.003860 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ac1741-e660-4bb7-9685-45cde423045b-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.003877 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ac1741-e660-4bb7-9685-45cde423045b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.338163 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.338246 4985 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509064 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509148 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509158 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509183 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509234 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509253 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509282 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509540 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509707 4985 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509705 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509730 4985 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.509800 4985 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.515998 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.610626 4985 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.610679 4985 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.686721 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.686703 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b99b87784-mpfxr" event={"ID":"60d0c604-71d7-4807-bb1b-c9d19e3e507d","Type":"ContainerDied","Data":"c9cd4685b2561d4bfab8e556421818f31e0ec4ea574f818a015848d9a5d3c51c"} Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.686809 4985 scope.go:117] "RemoveContainer" containerID="02756bf4947fdd483d5f2b687f4181cba6bb117342081a30a7a16fbeb5fbabc1" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.689014 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" event={"ID":"b4ac1741-e660-4bb7-9685-45cde423045b","Type":"ContainerDied","Data":"a20712e2b47d6141332b8ec00cd68c9e645e665f4c7037c466b4082036dc0777"} Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.689111 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.691047 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.692334 4985 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f" exitCode=137 Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.692438 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.709032 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b99b87784-mpfxr"] Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.710196 4985 scope.go:117] "RemoveContainer" containerID="122448aa64cf6196ff489a05c84b8a05a400071de4c9e24f1d6964159cd120ee" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.713684 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b99b87784-mpfxr"] Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.718362 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h"] Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.721394 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f847b5bdd-bzh7h"] Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.727136 4985 scope.go:117] "RemoveContainer" containerID="187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.743218 4985 scope.go:117] "RemoveContainer" containerID="187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f" Feb 24 10:14:36 crc kubenswrapper[4985]: E0224 10:14:36.743593 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f\": container with ID starting with 187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f not found: ID does not exist" containerID="187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f" Feb 24 10:14:36 crc kubenswrapper[4985]: I0224 10:14:36.743635 4985 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f"} err="failed to get container status \"187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f\": rpc error: code = NotFound desc = could not find container \"187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f\": container with ID starting with 187e9a2ff15e2fa14d2acece5ed51951e03d34788ba1d2e336cb002fe0f51e1f not found: ID does not exist" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.080420 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"] Feb 24 10:14:37 crc kubenswrapper[4985]: E0224 10:14:37.081224 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d0c604-71d7-4807-bb1b-c9d19e3e507d" containerName="controller-manager" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081241 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d0c604-71d7-4807-bb1b-c9d19e3e507d" containerName="controller-manager" Feb 24 10:14:37 crc kubenswrapper[4985]: E0224 10:14:37.081252 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081258 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 24 10:14:37 crc kubenswrapper[4985]: E0224 10:14:37.081277 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ac1741-e660-4bb7-9685-45cde423045b" containerName="route-controller-manager" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081287 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ac1741-e660-4bb7-9685-45cde423045b" containerName="route-controller-manager" Feb 24 10:14:37 crc kubenswrapper[4985]: E0224 10:14:37.081307 4985 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" containerName="installer" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081313 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" containerName="installer" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081428 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d0c604-71d7-4807-bb1b-c9d19e3e507d" containerName="controller-manager" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081438 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ac1741-e660-4bb7-9685-45cde423045b" containerName="route-controller-manager" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081449 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.081459 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3d6d7b-c88d-4966-8a11-915441e6b482" containerName="installer" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.083237 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.085643 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.085870 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.085907 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.086200 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.086212 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"] Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.086974 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.087020 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.090991 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"]
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.091814 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.092581 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.092839 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.094921 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.095236 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.096132 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.096943 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.097499 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"]
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.098195 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217299 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-config\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217348 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-client-ca\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217381 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-config\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217614 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5kl\" (UniqueName: \"kubernetes.io/projected/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-kube-api-access-9f5kl\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217666 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-proxy-ca-bundles\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217833 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-client-ca\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.217953 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-serving-cert\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.218056 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-serving-cert\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.218101 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7zxg\" (UniqueName: \"kubernetes.io/projected/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-kube-api-access-z7zxg\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319055 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5kl\" (UniqueName: \"kubernetes.io/projected/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-kube-api-access-9f5kl\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319112 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-proxy-ca-bundles\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319137 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-client-ca\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319156 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-serving-cert\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319178 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-serving-cert\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319196 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7zxg\" (UniqueName: \"kubernetes.io/projected/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-kube-api-access-z7zxg\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319222 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-config\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319237 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-client-ca\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.319261 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-config\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.320782 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-proxy-ca-bundles\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.320802 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-config\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.321332 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-client-ca\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.321438 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-client-ca\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.322237 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-config\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.324026 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-serving-cert\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.333350 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-serving-cert\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.335383 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7zxg\" (UniqueName: \"kubernetes.io/projected/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-kube-api-access-z7zxg\") pod \"controller-manager-f5dbb47b7-wxcvn\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.337194 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5kl\" (UniqueName: \"kubernetes.io/projected/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-kube-api-access-9f5kl\") pod \"route-controller-manager-5cb5c6584f-49rxc\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.414701 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.424221 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.673414 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"]
Feb 24 10:14:37 crc kubenswrapper[4985]: W0224 10:14:37.674770 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c967d5f_f1f8_496d_bae9_6efb0d2659a7.slice/crio-97788e80475ed9ea2b4e0e959a8982e61e3ea23fb167779f4d09dc9624fcd050 WatchSource:0}: Error finding container 97788e80475ed9ea2b4e0e959a8982e61e3ea23fb167779f4d09dc9624fcd050: Status 404 returned error can't find the container with id 97788e80475ed9ea2b4e0e959a8982e61e3ea23fb167779f4d09dc9624fcd050
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.698563 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc" event={"ID":"4c967d5f-f1f8-496d-bae9-6efb0d2659a7","Type":"ContainerStarted","Data":"97788e80475ed9ea2b4e0e959a8982e61e3ea23fb167779f4d09dc9624fcd050"}
Feb 24 10:14:37 crc kubenswrapper[4985]: I0224 10:14:37.804464 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"]
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.281749 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d0c604-71d7-4807-bb1b-c9d19e3e507d" path="/var/lib/kubelet/pods/60d0c604-71d7-4807-bb1b-c9d19e3e507d/volumes"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.282472 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ac1741-e660-4bb7-9685-45cde423045b" path="/var/lib/kubelet/pods/b4ac1741-e660-4bb7-9685-45cde423045b/volumes"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.282842 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.283087 4985 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.291331 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.291370 4985 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="63e49df7-8240-4caa-9f84-65fd8821b33c"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.294635 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.294678 4985 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="63e49df7-8240-4caa-9f84-65fd8821b33c"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.706915 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc" event={"ID":"4c967d5f-f1f8-496d-bae9-6efb0d2659a7","Type":"ContainerStarted","Data":"15daa75565e0d6dbb27a24dd51be7a16e21fe5832f77c628ab7ca9982e2817d3"}
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.707090 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.709270 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" event={"ID":"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9","Type":"ContainerStarted","Data":"be64f85b472ebad3cb64613ec7afac0fff17f4e03b8defda3f053ea9787a81e6"}
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.709303 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" event={"ID":"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9","Type":"ContainerStarted","Data":"26bd2bf463f40cbab2155910d63a5421dacc8230007dbaa5937ff7f296924192"}
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.709479 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.711793 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.714831 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:38 crc kubenswrapper[4985]: I0224 10:14:38.728361 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc" podStartSLOduration=3.728340024 podStartE2EDuration="3.728340024s" podCreationTimestamp="2026-02-24 10:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:14:38.724233277 +0000 UTC m=+363.198425837" watchObservedRunningTime="2026-02-24 10:14:38.728340024 +0000 UTC m=+363.202532604"
Feb 24 10:14:46 crc kubenswrapper[4985]: I0224 10:14:46.231212 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 24 10:14:47 crc kubenswrapper[4985]: I0224 10:14:47.086392 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 24 10:14:47 crc kubenswrapper[4985]: I0224 10:14:47.166138 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 24 10:14:47 crc kubenswrapper[4985]: I0224 10:14:47.590216 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52790: no serving certificate available for the kubelet"
Feb 24 10:14:50 crc kubenswrapper[4985]: I0224 10:14:50.673254 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 24 10:14:50 crc kubenswrapper[4985]: I0224 10:14:50.766385 4985 generic.go:334] "Generic (PLEG): container finished" podID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerID="c094ee19a2d18359139a745230e588b4a99d5fd2e567eff67655a43342efae0f" exitCode=0
Feb 24 10:14:50 crc kubenswrapper[4985]: I0224 10:14:50.766441 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" event={"ID":"6ef771ef-28ac-46c6-925e-f12a7a70b6c3","Type":"ContainerDied","Data":"c094ee19a2d18359139a745230e588b4a99d5fd2e567eff67655a43342efae0f"}
Feb 24 10:14:50 crc kubenswrapper[4985]: I0224 10:14:50.766991 4985 scope.go:117] "RemoveContainer" containerID="c094ee19a2d18359139a745230e588b4a99d5fd2e567eff67655a43342efae0f"
Feb 24 10:14:50 crc kubenswrapper[4985]: I0224 10:14:50.786925 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" podStartSLOduration=15.786906783 podStartE2EDuration="15.786906783s" podCreationTimestamp="2026-02-24 10:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:14:38.762374227 +0000 UTC m=+363.236566787" watchObservedRunningTime="2026-02-24 10:14:50.786906783 +0000 UTC m=+375.261099343"
Feb 24 10:14:51 crc kubenswrapper[4985]: I0224 10:14:51.774199 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" event={"ID":"6ef771ef-28ac-46c6-925e-f12a7a70b6c3","Type":"ContainerStarted","Data":"3642100016b8d9597fa4ebee463d4b2a820e327e6bcd36dd9448cc560a2e42b7"}
Feb 24 10:14:51 crc kubenswrapper[4985]: I0224 10:14:51.775584 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j"
Feb 24 10:14:51 crc kubenswrapper[4985]: I0224 10:14:51.777799 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j"
Feb 24 10:14:51 crc kubenswrapper[4985]: I0224 10:14:51.929019 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 24 10:14:52 crc kubenswrapper[4985]: I0224 10:14:52.056385 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 24 10:14:52 crc kubenswrapper[4985]: I0224 10:14:52.215702 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 24 10:14:52 crc kubenswrapper[4985]: I0224 10:14:52.387419 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.240298 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"]
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.240742 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" podUID="62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" containerName="controller-manager" containerID="cri-o://be64f85b472ebad3cb64613ec7afac0fff17f4e03b8defda3f053ea9787a81e6" gracePeriod=30
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.272615 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"]
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.273203 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc" podUID="4c967d5f-f1f8-496d-bae9-6efb0d2659a7" containerName="route-controller-manager" containerID="cri-o://15daa75565e0d6dbb27a24dd51be7a16e21fe5832f77c628ab7ca9982e2817d3" gracePeriod=30
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.796213 4985 generic.go:334] "Generic (PLEG): container finished" podID="4c967d5f-f1f8-496d-bae9-6efb0d2659a7" containerID="15daa75565e0d6dbb27a24dd51be7a16e21fe5832f77c628ab7ca9982e2817d3" exitCode=0
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.796325 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc" event={"ID":"4c967d5f-f1f8-496d-bae9-6efb0d2659a7","Type":"ContainerDied","Data":"15daa75565e0d6dbb27a24dd51be7a16e21fe5832f77c628ab7ca9982e2817d3"}
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.798702 4985 generic.go:334] "Generic (PLEG): container finished" podID="62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" containerID="be64f85b472ebad3cb64613ec7afac0fff17f4e03b8defda3f053ea9787a81e6" exitCode=0
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.798762 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" event={"ID":"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9","Type":"ContainerDied","Data":"be64f85b472ebad3cb64613ec7afac0fff17f4e03b8defda3f053ea9787a81e6"}
Feb 24 10:14:55 crc kubenswrapper[4985]: I0224 10:14:55.889170 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.044035 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-client-ca\") pod \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.044328 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-serving-cert\") pod \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.044469 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-config\") pod \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.044658 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f5kl\" (UniqueName: \"kubernetes.io/projected/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-kube-api-access-9f5kl\") pod \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\" (UID: \"4c967d5f-f1f8-496d-bae9-6efb0d2659a7\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.045038 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-config" (OuterVolumeSpecName: "config") pod "4c967d5f-f1f8-496d-bae9-6efb0d2659a7" (UID: "4c967d5f-f1f8-496d-bae9-6efb0d2659a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.045601 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c967d5f-f1f8-496d-bae9-6efb0d2659a7" (UID: "4c967d5f-f1f8-496d-bae9-6efb0d2659a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.048973 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c967d5f-f1f8-496d-bae9-6efb0d2659a7" (UID: "4c967d5f-f1f8-496d-bae9-6efb0d2659a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.049375 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-kube-api-access-9f5kl" (OuterVolumeSpecName: "kube-api-access-9f5kl") pod "4c967d5f-f1f8-496d-bae9-6efb0d2659a7" (UID: "4c967d5f-f1f8-496d-bae9-6efb0d2659a7"). InnerVolumeSpecName "kube-api-access-9f5kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.146108 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.146164 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.146186 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.146205 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f5kl\" (UniqueName: \"kubernetes.io/projected/4c967d5f-f1f8-496d-bae9-6efb0d2659a7-kube-api-access-9f5kl\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.603641 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.732476 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.759264 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-proxy-ca-bundles\") pod \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.759381 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-config\") pod \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.759700 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-serving-cert\") pod \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.759763 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7zxg\" (UniqueName: \"kubernetes.io/projected/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-kube-api-access-z7zxg\") pod \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.759861 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-client-ca\") pod \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\" (UID: \"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9\") "
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.760300 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" (UID: "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.760331 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" (UID: "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.760551 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-config" (OuterVolumeSpecName: "config") pod "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" (UID: "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.764139 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-kube-api-access-z7zxg" (OuterVolumeSpecName: "kube-api-access-z7zxg") pod "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" (UID: "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9"). InnerVolumeSpecName "kube-api-access-z7zxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.765104 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" (UID: "62c3035c-c1a2-424a-af19-cb9b9b5aa2d9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.806021 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc" event={"ID":"4c967d5f-f1f8-496d-bae9-6efb0d2659a7","Type":"ContainerDied","Data":"97788e80475ed9ea2b4e0e959a8982e61e3ea23fb167779f4d09dc9624fcd050"}
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.806121 4985 scope.go:117] "RemoveContainer" containerID="15daa75565e0d6dbb27a24dd51be7a16e21fe5832f77c628ab7ca9982e2817d3"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.806031 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.808992 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn" event={"ID":"62c3035c-c1a2-424a-af19-cb9b9b5aa2d9","Type":"ContainerDied","Data":"26bd2bf463f40cbab2155910d63a5421dacc8230007dbaa5937ff7f296924192"}
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.809017 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.825849 4985 scope.go:117] "RemoveContainer" containerID="be64f85b472ebad3cb64613ec7afac0fff17f4e03b8defda3f053ea9787a81e6"
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.826721 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"]
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.831178 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-49rxc"]
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.838225 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"]
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.843436 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-wxcvn"]
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.860947 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.860966 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.860975 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.860983 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:56 crc kubenswrapper[4985]: I0224 10:14:56.860992 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7zxg\" (UniqueName: \"kubernetes.io/projected/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9-kube-api-access-z7zxg\") on node \"crc\" DevicePath \"\"" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.095829 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl"] Feb 24 10:14:57 crc kubenswrapper[4985]: E0224 10:14:57.096073 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c967d5f-f1f8-496d-bae9-6efb0d2659a7" containerName="route-controller-manager" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.096088 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c967d5f-f1f8-496d-bae9-6efb0d2659a7" containerName="route-controller-manager" Feb 24 10:14:57 crc kubenswrapper[4985]: E0224 10:14:57.096113 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" containerName="controller-manager" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.096121 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" containerName="controller-manager" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.096230 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c967d5f-f1f8-496d-bae9-6efb0d2659a7" containerName="route-controller-manager" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.096255 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" containerName="controller-manager" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.096739 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.099862 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.100506 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.100996 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.101444 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.102554 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.104001 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78bc8df5d9-drtqs"] Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.104776 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.112282 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.112882 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.113033 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.114503 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.114718 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.114851 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.114962 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.118736 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl"] Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.124317 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78bc8df5d9-drtqs"] Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.125802 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:14:57 
crc kubenswrapper[4985]: I0224 10:14:57.265229 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-client-ca\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.265537 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-serving-cert\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.265690 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fced2696-6bd1-4042-9e03-0ff2212ed90f-serving-cert\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.265812 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-client-ca\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.265953 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6wg\" (UniqueName: 
\"kubernetes.io/projected/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-kube-api-access-rf6wg\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.266232 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwv94\" (UniqueName: \"kubernetes.io/projected/fced2696-6bd1-4042-9e03-0ff2212ed90f-kube-api-access-jwv94\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.266374 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-config\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.266500 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-proxy-ca-bundles\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.266623 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-config\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " 
pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367336 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwv94\" (UniqueName: \"kubernetes.io/projected/fced2696-6bd1-4042-9e03-0ff2212ed90f-kube-api-access-jwv94\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367457 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-config\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367551 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-proxy-ca-bundles\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367661 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-config\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367709 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-client-ca\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367752 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-serving-cert\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.367964 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fced2696-6bd1-4042-9e03-0ff2212ed90f-serving-cert\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.368017 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-client-ca\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.368050 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6wg\" (UniqueName: \"kubernetes.io/projected/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-kube-api-access-rf6wg\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc 
kubenswrapper[4985]: I0224 10:14:57.369240 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-config\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.369488 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-client-ca\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.370083 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-config\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.370349 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-client-ca\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.370420 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-proxy-ca-bundles\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " 
pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.374833 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fced2696-6bd1-4042-9e03-0ff2212ed90f-serving-cert\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.375880 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-serving-cert\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.399350 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwv94\" (UniqueName: \"kubernetes.io/projected/fced2696-6bd1-4042-9e03-0ff2212ed90f-kube-api-access-jwv94\") pod \"controller-manager-78bc8df5d9-drtqs\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.402347 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6wg\" (UniqueName: \"kubernetes.io/projected/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-kube-api-access-rf6wg\") pod \"route-controller-manager-7674b6c49c-9x9pl\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.427327 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.442467 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.806068 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl"] Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.823853 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" event={"ID":"05bd42bc-7926-4f6e-8da4-da4cd0426aa7","Type":"ContainerStarted","Data":"d86918383a619a0aa2e0ec10981159b066fd5dab3a7cfc07739c550f0cfbc5f6"} Feb 24 10:14:57 crc kubenswrapper[4985]: I0224 10:14:57.842521 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78bc8df5d9-drtqs"] Feb 24 10:14:57 crc kubenswrapper[4985]: W0224 10:14:57.846670 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfced2696_6bd1_4042_9e03_0ff2212ed90f.slice/crio-e246d60ba5b95c58736884e86391109e14e1e8847da8c01b1a774f7073a36453 WatchSource:0}: Error finding container e246d60ba5b95c58736884e86391109e14e1e8847da8c01b1a774f7073a36453: Status 404 returned error can't find the container with id e246d60ba5b95c58736884e86391109e14e1e8847da8c01b1a774f7073a36453 Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.271637 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c967d5f-f1f8-496d-bae9-6efb0d2659a7" path="/var/lib/kubelet/pods/4c967d5f-f1f8-496d-bae9-6efb0d2659a7/volumes" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.272405 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="62c3035c-c1a2-424a-af19-cb9b9b5aa2d9" path="/var/lib/kubelet/pods/62c3035c-c1a2-424a-af19-cb9b9b5aa2d9/volumes" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.829875 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" event={"ID":"fced2696-6bd1-4042-9e03-0ff2212ed90f","Type":"ContainerStarted","Data":"7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5"} Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.829939 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" event={"ID":"fced2696-6bd1-4042-9e03-0ff2212ed90f","Type":"ContainerStarted","Data":"e246d60ba5b95c58736884e86391109e14e1e8847da8c01b1a774f7073a36453"} Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.830092 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.831356 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" event={"ID":"05bd42bc-7926-4f6e-8da4-da4cd0426aa7","Type":"ContainerStarted","Data":"1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730"} Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.831650 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.834762 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.835550 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.849688 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" podStartSLOduration=3.8496702259999998 podStartE2EDuration="3.849670226s" podCreationTimestamp="2026-02-24 10:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:14:58.846604019 +0000 UTC m=+383.320796599" watchObservedRunningTime="2026-02-24 10:14:58.849670226 +0000 UTC m=+383.323862786" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.883354 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" podStartSLOduration=3.883337069 podStartE2EDuration="3.883337069s" podCreationTimestamp="2026-02-24 10:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:14:58.882472284 +0000 UTC m=+383.356664864" watchObservedRunningTime="2026-02-24 10:14:58.883337069 +0000 UTC m=+383.357529629" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.957180 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 24 10:14:58 crc kubenswrapper[4985]: I0224 10:14:58.976202 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 24 10:14:59 crc kubenswrapper[4985]: I0224 10:14:59.493385 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 24 10:14:59 crc kubenswrapper[4985]: I0224 10:14:59.964992 4985 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.164570 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx"] Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.165269 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.167044 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.167138 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.174531 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx"] Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.300861 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/803e74bd-370e-4592-aead-7853850525af-secret-volume\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.301188 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/803e74bd-370e-4592-aead-7853850525af-config-volume\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.301369 4985 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pxd2\" (UniqueName: \"kubernetes.io/projected/803e74bd-370e-4592-aead-7853850525af-kube-api-access-5pxd2\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.403161 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/803e74bd-370e-4592-aead-7853850525af-secret-volume\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.403261 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/803e74bd-370e-4592-aead-7853850525af-config-volume\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.403366 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pxd2\" (UniqueName: \"kubernetes.io/projected/803e74bd-370e-4592-aead-7853850525af-kube-api-access-5pxd2\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.405121 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/803e74bd-370e-4592-aead-7853850525af-config-volume\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.412066 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/803e74bd-370e-4592-aead-7853850525af-secret-volume\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.427506 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pxd2\" (UniqueName: \"kubernetes.io/projected/803e74bd-370e-4592-aead-7853850525af-kube-api-access-5pxd2\") pod \"collect-profiles-29532135-ksdjx\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:00 crc kubenswrapper[4985]: I0224 10:15:00.670215 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:01 crc kubenswrapper[4985]: I0224 10:15:01.090337 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx"] Feb 24 10:15:01 crc kubenswrapper[4985]: W0224 10:15:01.095114 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803e74bd_370e_4592_aead_7853850525af.slice/crio-7f16048b3c5dc8ca6160b43f31ddfab817aa34043b228ce70f208c4c00df9c71 WatchSource:0}: Error finding container 7f16048b3c5dc8ca6160b43f31ddfab817aa34043b228ce70f208c4c00df9c71: Status 404 returned error can't find the container with id 7f16048b3c5dc8ca6160b43f31ddfab817aa34043b228ce70f208c4c00df9c71 Feb 24 10:15:01 crc kubenswrapper[4985]: I0224 10:15:01.845031 4985 generic.go:334] "Generic (PLEG): container finished" podID="803e74bd-370e-4592-aead-7853850525af" containerID="7cbce9f967b4430937848ad5bd1a4d708ac28167bd0cffd44247a83c453c281b" exitCode=0 Feb 24 10:15:01 crc kubenswrapper[4985]: I0224 10:15:01.845104 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" event={"ID":"803e74bd-370e-4592-aead-7853850525af","Type":"ContainerDied","Data":"7cbce9f967b4430937848ad5bd1a4d708ac28167bd0cffd44247a83c453c281b"} Feb 24 10:15:01 crc kubenswrapper[4985]: I0224 10:15:01.845416 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" event={"ID":"803e74bd-370e-4592-aead-7853850525af","Type":"ContainerStarted","Data":"7f16048b3c5dc8ca6160b43f31ddfab817aa34043b228ce70f208c4c00df9c71"} Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.149664 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.337804 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/803e74bd-370e-4592-aead-7853850525af-secret-volume\") pod \"803e74bd-370e-4592-aead-7853850525af\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.337851 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/803e74bd-370e-4592-aead-7853850525af-config-volume\") pod \"803e74bd-370e-4592-aead-7853850525af\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.337873 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pxd2\" (UniqueName: \"kubernetes.io/projected/803e74bd-370e-4592-aead-7853850525af-kube-api-access-5pxd2\") pod \"803e74bd-370e-4592-aead-7853850525af\" (UID: \"803e74bd-370e-4592-aead-7853850525af\") " Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.338595 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e74bd-370e-4592-aead-7853850525af-config-volume" (OuterVolumeSpecName: "config-volume") pod "803e74bd-370e-4592-aead-7853850525af" (UID: "803e74bd-370e-4592-aead-7853850525af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.342550 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803e74bd-370e-4592-aead-7853850525af-kube-api-access-5pxd2" (OuterVolumeSpecName: "kube-api-access-5pxd2") pod "803e74bd-370e-4592-aead-7853850525af" (UID: "803e74bd-370e-4592-aead-7853850525af"). 
InnerVolumeSpecName "kube-api-access-5pxd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.342618 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e74bd-370e-4592-aead-7853850525af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "803e74bd-370e-4592-aead-7853850525af" (UID: "803e74bd-370e-4592-aead-7853850525af"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.438711 4985 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/803e74bd-370e-4592-aead-7853850525af-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.438745 4985 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/803e74bd-370e-4592-aead-7853850525af-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.438756 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pxd2\" (UniqueName: \"kubernetes.io/projected/803e74bd-370e-4592-aead-7853850525af-kube-api-access-5pxd2\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.855505 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" event={"ID":"803e74bd-370e-4592-aead-7853850525af","Type":"ContainerDied","Data":"7f16048b3c5dc8ca6160b43f31ddfab817aa34043b228ce70f208c4c00df9c71"} Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.855706 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f16048b3c5dc8ca6160b43f31ddfab817aa34043b228ce70f208c4c00df9c71" Feb 24 10:15:03 crc kubenswrapper[4985]: I0224 10:15:03.855842 4985 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-ksdjx" Feb 24 10:15:04 crc kubenswrapper[4985]: I0224 10:15:04.708123 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 24 10:15:04 crc kubenswrapper[4985]: I0224 10:15:04.920928 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 24 10:15:05 crc kubenswrapper[4985]: I0224 10:15:05.392750 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 10:15:07 crc kubenswrapper[4985]: I0224 10:15:07.578287 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 24 10:15:08 crc kubenswrapper[4985]: I0224 10:15:08.300829 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 24 10:15:13 crc kubenswrapper[4985]: I0224 10:15:13.625050 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:15:13 crc kubenswrapper[4985]: I0224 10:15:13.625534 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:15:13 crc kubenswrapper[4985]: I0224 10:15:13.995832 4985 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 10:15:43 crc kubenswrapper[4985]: I0224 10:15:43.624847 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:15:43 crc kubenswrapper[4985]: I0224 10:15:43.625388 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.261188 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl"] Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.262178 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" podUID="05bd42bc-7926-4f6e-8da4-da4cd0426aa7" containerName="route-controller-manager" containerID="cri-o://1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730" gracePeriod=30 Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.271758 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78bc8df5d9-drtqs"] Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.272074 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" podUID="fced2696-6bd1-4042-9e03-0ff2212ed90f" containerName="controller-manager" containerID="cri-o://7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5" gracePeriod=30 Feb 24 
10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.641984 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.717716 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.773077 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fced2696-6bd1-4042-9e03-0ff2212ed90f-serving-cert\") pod \"fced2696-6bd1-4042-9e03-0ff2212ed90f\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.773206 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-proxy-ca-bundles\") pod \"fced2696-6bd1-4042-9e03-0ff2212ed90f\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.773263 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwv94\" (UniqueName: \"kubernetes.io/projected/fced2696-6bd1-4042-9e03-0ff2212ed90f-kube-api-access-jwv94\") pod \"fced2696-6bd1-4042-9e03-0ff2212ed90f\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.773309 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-client-ca\") pod \"fced2696-6bd1-4042-9e03-0ff2212ed90f\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.773393 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-config\") pod \"fced2696-6bd1-4042-9e03-0ff2212ed90f\" (UID: \"fced2696-6bd1-4042-9e03-0ff2212ed90f\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.775299 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fced2696-6bd1-4042-9e03-0ff2212ed90f" (UID: "fced2696-6bd1-4042-9e03-0ff2212ed90f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.775763 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-config" (OuterVolumeSpecName: "config") pod "fced2696-6bd1-4042-9e03-0ff2212ed90f" (UID: "fced2696-6bd1-4042-9e03-0ff2212ed90f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.776615 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-client-ca" (OuterVolumeSpecName: "client-ca") pod "fced2696-6bd1-4042-9e03-0ff2212ed90f" (UID: "fced2696-6bd1-4042-9e03-0ff2212ed90f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.780722 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fced2696-6bd1-4042-9e03-0ff2212ed90f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fced2696-6bd1-4042-9e03-0ff2212ed90f" (UID: "fced2696-6bd1-4042-9e03-0ff2212ed90f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.781140 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fced2696-6bd1-4042-9e03-0ff2212ed90f-kube-api-access-jwv94" (OuterVolumeSpecName: "kube-api-access-jwv94") pod "fced2696-6bd1-4042-9e03-0ff2212ed90f" (UID: "fced2696-6bd1-4042-9e03-0ff2212ed90f"). InnerVolumeSpecName "kube-api-access-jwv94". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.874663 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-serving-cert\") pod \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.874730 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-config\") pod \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.874752 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf6wg\" (UniqueName: \"kubernetes.io/projected/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-kube-api-access-rf6wg\") pod \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.874831 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-client-ca\") pod \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\" (UID: \"05bd42bc-7926-4f6e-8da4-da4cd0426aa7\") " Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875023 4985 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875034 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fced2696-6bd1-4042-9e03-0ff2212ed90f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875043 4985 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875052 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwv94\" (UniqueName: \"kubernetes.io/projected/fced2696-6bd1-4042-9e03-0ff2212ed90f-kube-api-access-jwv94\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875060 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fced2696-6bd1-4042-9e03-0ff2212ed90f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875500 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-client-ca" (OuterVolumeSpecName: "client-ca") pod "05bd42bc-7926-4f6e-8da4-da4cd0426aa7" (UID: "05bd42bc-7926-4f6e-8da4-da4cd0426aa7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.875690 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-config" (OuterVolumeSpecName: "config") pod "05bd42bc-7926-4f6e-8da4-da4cd0426aa7" (UID: "05bd42bc-7926-4f6e-8da4-da4cd0426aa7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.877805 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "05bd42bc-7926-4f6e-8da4-da4cd0426aa7" (UID: "05bd42bc-7926-4f6e-8da4-da4cd0426aa7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.878132 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-kube-api-access-rf6wg" (OuterVolumeSpecName: "kube-api-access-rf6wg") pod "05bd42bc-7926-4f6e-8da4-da4cd0426aa7" (UID: "05bd42bc-7926-4f6e-8da4-da4cd0426aa7"). InnerVolumeSpecName "kube-api-access-rf6wg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.976602 4985 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.976649 4985 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.976658 4985 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:55 crc kubenswrapper[4985]: I0224 10:15:55.976667 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf6wg\" (UniqueName: \"kubernetes.io/projected/05bd42bc-7926-4f6e-8da4-da4cd0426aa7-kube-api-access-rf6wg\") on node \"crc\" DevicePath \"\"" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.131809 4985 generic.go:334] "Generic (PLEG): container finished" podID="fced2696-6bd1-4042-9e03-0ff2212ed90f" containerID="7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5" exitCode=0 Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.131883 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" event={"ID":"fced2696-6bd1-4042-9e03-0ff2212ed90f","Type":"ContainerDied","Data":"7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5"} Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.131940 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" 
event={"ID":"fced2696-6bd1-4042-9e03-0ff2212ed90f","Type":"ContainerDied","Data":"e246d60ba5b95c58736884e86391109e14e1e8847da8c01b1a774f7073a36453"} Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.132001 4985 scope.go:117] "RemoveContainer" containerID="7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.132110 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc8df5d9-drtqs" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.137341 4985 generic.go:334] "Generic (PLEG): container finished" podID="05bd42bc-7926-4f6e-8da4-da4cd0426aa7" containerID="1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730" exitCode=0 Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.137392 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" event={"ID":"05bd42bc-7926-4f6e-8da4-da4cd0426aa7","Type":"ContainerDied","Data":"1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730"} Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.137422 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" event={"ID":"05bd42bc-7926-4f6e-8da4-da4cd0426aa7","Type":"ContainerDied","Data":"d86918383a619a0aa2e0ec10981159b066fd5dab3a7cfc07739c550f0cfbc5f6"} Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.137495 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.150588 4985 scope.go:117] "RemoveContainer" containerID="7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5" Feb 24 10:15:56 crc kubenswrapper[4985]: E0224 10:15:56.151175 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5\": container with ID starting with 7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5 not found: ID does not exist" containerID="7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.152137 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5"} err="failed to get container status \"7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5\": rpc error: code = NotFound desc = could not find container \"7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5\": container with ID starting with 7cc8c2fc04ab8e47df580c592f183b2479bd8bf20d3bb7e8672e4a7d778e3df5 not found: ID does not exist" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.152184 4985 scope.go:117] "RemoveContainer" containerID="1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.167488 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78bc8df5d9-drtqs"] Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.170609 4985 scope.go:117] "RemoveContainer" containerID="1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.170944 4985 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-controller-manager/controller-manager-78bc8df5d9-drtqs"] Feb 24 10:15:56 crc kubenswrapper[4985]: E0224 10:15:56.171103 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730\": container with ID starting with 1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730 not found: ID does not exist" containerID="1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.171139 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730"} err="failed to get container status \"1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730\": rpc error: code = NotFound desc = could not find container \"1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730\": container with ID starting with 1a754c9bee196676763c034f0e3eaf93d80bae2ad94cdcc96a7ddc989e812730 not found: ID does not exist" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.182903 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl"] Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.185609 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674b6c49c-9x9pl"] Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.270549 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05bd42bc-7926-4f6e-8da4-da4cd0426aa7" path="/var/lib/kubelet/pods/05bd42bc-7926-4f6e-8da4-da4cd0426aa7/volumes" Feb 24 10:15:56 crc kubenswrapper[4985]: I0224 10:15:56.271090 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fced2696-6bd1-4042-9e03-0ff2212ed90f" 
path="/var/lib/kubelet/pods/fced2696-6bd1-4042-9e03-0ff2212ed90f/volumes" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.134287 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"] Feb 24 10:15:57 crc kubenswrapper[4985]: E0224 10:15:57.135034 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fced2696-6bd1-4042-9e03-0ff2212ed90f" containerName="controller-manager" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.135183 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="fced2696-6bd1-4042-9e03-0ff2212ed90f" containerName="controller-manager" Feb 24 10:15:57 crc kubenswrapper[4985]: E0224 10:15:57.135263 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05bd42bc-7926-4f6e-8da4-da4cd0426aa7" containerName="route-controller-manager" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.135343 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="05bd42bc-7926-4f6e-8da4-da4cd0426aa7" containerName="route-controller-manager" Feb 24 10:15:57 crc kubenswrapper[4985]: E0224 10:15:57.135415 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e74bd-370e-4592-aead-7853850525af" containerName="collect-profiles" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.135488 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e74bd-370e-4592-aead-7853850525af" containerName="collect-profiles" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.135682 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="05bd42bc-7926-4f6e-8da4-da4cd0426aa7" containerName="route-controller-manager" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.135772 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e74bd-370e-4592-aead-7853850525af" containerName="collect-profiles" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.135849 4985 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fced2696-6bd1-4042-9e03-0ff2212ed90f" containerName="controller-manager" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.136436 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.139196 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"] Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.139537 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.139931 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.141556 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.141818 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.141820 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.141818 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.141867 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.143227 4985 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.144648 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.145089 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.145322 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.145339 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.145371 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.152723 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.161150 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"]
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.177781 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"]
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.291691 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/389c450f-515f-41c5-aece-131c0fc93b04-serving-cert\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.291739 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/389c450f-515f-41c5-aece-131c0fc93b04-config\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.291763 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmfk\" (UniqueName: \"kubernetes.io/projected/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-kube-api-access-shmfk\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.291782 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-client-ca\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.291821 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28vbv\" (UniqueName: \"kubernetes.io/projected/389c450f-515f-41c5-aece-131c0fc93b04-kube-api-access-28vbv\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.291970 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-serving-cert\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.292065 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-config\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.292113 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-proxy-ca-bundles\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.292200 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/389c450f-515f-41c5-aece-131c0fc93b04-client-ca\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393368 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/389c450f-515f-41c5-aece-131c0fc93b04-serving-cert\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393446 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/389c450f-515f-41c5-aece-131c0fc93b04-config\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393473 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmfk\" (UniqueName: \"kubernetes.io/projected/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-kube-api-access-shmfk\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393515 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-client-ca\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393539 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28vbv\" (UniqueName: \"kubernetes.io/projected/389c450f-515f-41c5-aece-131c0fc93b04-kube-api-access-28vbv\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393568 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-serving-cert\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393612 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-config\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393637 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-proxy-ca-bundles\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.393684 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/389c450f-515f-41c5-aece-131c0fc93b04-client-ca\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.395129 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-client-ca\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.395636 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/389c450f-515f-41c5-aece-131c0fc93b04-client-ca\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.395728 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-config\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.396069 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/389c450f-515f-41c5-aece-131c0fc93b04-config\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.409073 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-proxy-ca-bundles\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.409587 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-serving-cert\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.409604 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/389c450f-515f-41c5-aece-131c0fc93b04-serving-cert\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.417832 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28vbv\" (UniqueName: \"kubernetes.io/projected/389c450f-515f-41c5-aece-131c0fc93b04-kube-api-access-28vbv\") pod \"route-controller-manager-5cb5c6584f-7pws4\" (UID: \"389c450f-515f-41c5-aece-131c0fc93b04\") " pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.429551 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmfk\" (UniqueName: \"kubernetes.io/projected/2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8-kube-api-access-shmfk\") pod \"controller-manager-f5dbb47b7-ndkwq\" (UID: \"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8\") " pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.455283 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.463974 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.877276 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"]
Feb 24 10:15:57 crc kubenswrapper[4985]: I0224 10:15:57.920352 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"]
Feb 24 10:15:57 crc kubenswrapper[4985]: W0224 10:15:57.931537 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389c450f_515f_41c5_aece_131c0fc93b04.slice/crio-e1aee4ecc27b7d3e327588754adad1fc82ccb229de762037bf864871dd7acdbf WatchSource:0}: Error finding container e1aee4ecc27b7d3e327588754adad1fc82ccb229de762037bf864871dd7acdbf: Status 404 returned error can't find the container with id e1aee4ecc27b7d3e327588754adad1fc82ccb229de762037bf864871dd7acdbf
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.152531 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq" event={"ID":"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8","Type":"ContainerStarted","Data":"a6768e26b9970a58a1ae47c4f93af1810b83e1a6063f616c0593123ee83757da"}
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.152878 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.152915 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq" event={"ID":"2c8bfb23-9147-4fbe-bc1a-f3f5df5118e8","Type":"ContainerStarted","Data":"22cc2929aaa87a5ce6540c92bdd11a041a49bd4c50163346d7b751f4c1ebea3b"}
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.154291 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4" event={"ID":"389c450f-515f-41c5-aece-131c0fc93b04","Type":"ContainerStarted","Data":"4ba7432d41043a0fd144460fda596abbff7f47900f4d0eb672b69c75121f4b6f"}
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.154331 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4" event={"ID":"389c450f-515f-41c5-aece-131c0fc93b04","Type":"ContainerStarted","Data":"e1aee4ecc27b7d3e327588754adad1fc82ccb229de762037bf864871dd7acdbf"}
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.154473 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.172967 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq"
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.179309 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f5dbb47b7-ndkwq" podStartSLOduration=3.179289242 podStartE2EDuration="3.179289242s" podCreationTimestamp="2026-02-24 10:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:15:58.175442014 +0000 UTC m=+442.649634594" watchObservedRunningTime="2026-02-24 10:15:58.179289242 +0000 UTC m=+442.653481802"
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.208230 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4" podStartSLOduration=3.208211889 podStartE2EDuration="3.208211889s" podCreationTimestamp="2026-02-24 10:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:15:58.207303438 +0000 UTC m=+442.681495998" watchObservedRunningTime="2026-02-24 10:15:58.208211889 +0000 UTC m=+442.682404449"
Feb 24 10:15:58 crc kubenswrapper[4985]: I0224 10:15:58.690677 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb5c6584f-7pws4"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.426854 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gp42"]
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.427987 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.442080 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gp42"]
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581151 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdw86\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-kube-api-access-rdw86\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581508 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6514ab8-4ac3-4374-8432-f3a637c07ae8-registry-certificates\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581545 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-registry-tls\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581578 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6514ab8-4ac3-4374-8432-f3a637c07ae8-trusted-ca\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581602 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-bound-sa-token\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581647 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6514ab8-4ac3-4374-8432-f3a637c07ae8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581667 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6514ab8-4ac3-4374-8432-f3a637c07ae8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.581868 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.608784 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683230 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6514ab8-4ac3-4374-8432-f3a637c07ae8-registry-certificates\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683294 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-registry-tls\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683331 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6514ab8-4ac3-4374-8432-f3a637c07ae8-trusted-ca\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683358 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-bound-sa-token\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683384 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6514ab8-4ac3-4374-8432-f3a637c07ae8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683406 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6514ab8-4ac3-4374-8432-f3a637c07ae8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683470 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdw86\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-kube-api-access-rdw86\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.683868 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6514ab8-4ac3-4374-8432-f3a637c07ae8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.685376 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6514ab8-4ac3-4374-8432-f3a637c07ae8-registry-certificates\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.685614 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6514ab8-4ac3-4374-8432-f3a637c07ae8-trusted-ca\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.690194 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-registry-tls\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.697962 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6514ab8-4ac3-4374-8432-f3a637c07ae8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.699230 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdw86\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-kube-api-access-rdw86\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.702465 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6514ab8-4ac3-4374-8432-f3a637c07ae8-bound-sa-token\") pod \"image-registry-66df7c8f76-7gp42\" (UID: \"b6514ab8-4ac3-4374-8432-f3a637c07ae8\") " pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:04 crc kubenswrapper[4985]: I0224 10:16:04.746946 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:05 crc kubenswrapper[4985]: I0224 10:16:05.176058 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7gp42"]
Feb 24 10:16:05 crc kubenswrapper[4985]: I0224 10:16:05.189371 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42" event={"ID":"b6514ab8-4ac3-4374-8432-f3a637c07ae8","Type":"ContainerStarted","Data":"5df6f2f803716b90a86d49d118fcf14425fa5bc24d24d7a6856b686a016c785f"}
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.196077 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42" event={"ID":"b6514ab8-4ac3-4374-8432-f3a637c07ae8","Type":"ContainerStarted","Data":"4ccf29e78398b9603e748b307c863a2019c67214f16924416f82e3e8d5649fe3"}
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.197557 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.213262 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42" podStartSLOduration=2.213233681 podStartE2EDuration="2.213233681s" podCreationTimestamp="2026-02-24 10:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:16:06.212031224 +0000 UTC m=+450.686223784" watchObservedRunningTime="2026-02-24 10:16:06.213233681 +0000 UTC m=+450.687426281"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.668678 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lpjfw"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.669387 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lpjfw" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="registry-server" containerID="cri-o://922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65" gracePeriod=30
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.674997 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5p66"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.675395 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l5p66" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="registry-server" containerID="cri-o://c7fe22917b7d142d3fe274699e2f1f5138e0263bfd36f6a94aff95ad95eb8a5a" gracePeriod=30
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.699375 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rlq9j"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.699635 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" containerID="cri-o://3642100016b8d9597fa4ebee463d4b2a820e327e6bcd36dd9448cc560a2e42b7" gracePeriod=30
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.703096 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptllg"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.703386 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ptllg" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="registry-server" containerID="cri-o://a694ca5350c8dd87df4e719af409ee5abaa11bfb56d6b48a81555580033949be" gracePeriod=30
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.705578 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qqb8n"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.709612 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.724506 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qqb8n"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.729684 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rhlvp"]
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.730001 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rhlvp" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="registry-server" containerID="cri-o://09e4a5fca1f0926f07129896234ee5043d7c6da9fd3c83fd99e821c6979a7f1e" gracePeriod=30
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.814139 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3c8bb1e-35de-478b-81fd-ff6dda676508-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.814186 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3c8bb1e-35de-478b-81fd-ff6dda676508-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.814230 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9d4v\" (UniqueName: \"kubernetes.io/projected/b3c8bb1e-35de-478b-81fd-ff6dda676508-kube-api-access-c9d4v\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.915415 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3c8bb1e-35de-478b-81fd-ff6dda676508-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.915469 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3c8bb1e-35de-478b-81fd-ff6dda676508-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.915526 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9d4v\" (UniqueName: \"kubernetes.io/projected/b3c8bb1e-35de-478b-81fd-ff6dda676508-kube-api-access-c9d4v\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.916812 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3c8bb1e-35de-478b-81fd-ff6dda676508-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.926267 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3c8bb1e-35de-478b-81fd-ff6dda676508-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:06 crc kubenswrapper[4985]: I0224 10:16:06.933771 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9d4v\" (UniqueName: \"kubernetes.io/projected/b3c8bb1e-35de-478b-81fd-ff6dda676508-kube-api-access-c9d4v\") pod \"marketplace-operator-79b997595-qqb8n\" (UID: \"b3c8bb1e-35de-478b-81fd-ff6dda676508\") " pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.043559 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n"
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.180379 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lpjfw"
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.206357 4985 generic.go:334] "Generic (PLEG): container finished" podID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerID="c7fe22917b7d142d3fe274699e2f1f5138e0263bfd36f6a94aff95ad95eb8a5a" exitCode=0
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.206419 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerDied","Data":"c7fe22917b7d142d3fe274699e2f1f5138e0263bfd36f6a94aff95ad95eb8a5a"}
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.209226 4985 generic.go:334] "Generic (PLEG): container finished" podID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerID="922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65" exitCode=0
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.209307 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpjfw" event={"ID":"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf","Type":"ContainerDied","Data":"922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65"}
Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.209325 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lpjfw" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.209331 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lpjfw" event={"ID":"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf","Type":"ContainerDied","Data":"e2e9add4855b0e37590ea0f1d33c498f6b79987a891a5f60c7afa3e8c22742fc"} Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.209343 4985 scope.go:117] "RemoveContainer" containerID="922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.215295 4985 generic.go:334] "Generic (PLEG): container finished" podID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerID="a694ca5350c8dd87df4e719af409ee5abaa11bfb56d6b48a81555580033949be" exitCode=0 Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.215377 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerDied","Data":"a694ca5350c8dd87df4e719af409ee5abaa11bfb56d6b48a81555580033949be"} Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.217223 4985 generic.go:334] "Generic (PLEG): container finished" podID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerID="3642100016b8d9597fa4ebee463d4b2a820e327e6bcd36dd9448cc560a2e42b7" exitCode=0 Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.217279 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" event={"ID":"6ef771ef-28ac-46c6-925e-f12a7a70b6c3","Type":"ContainerDied","Data":"3642100016b8d9597fa4ebee463d4b2a820e327e6bcd36dd9448cc560a2e42b7"} Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.241115 4985 scope.go:117] "RemoveContainer" containerID="4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.241282 4985 
generic.go:334] "Generic (PLEG): container finished" podID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerID="09e4a5fca1f0926f07129896234ee5043d7c6da9fd3c83fd99e821c6979a7f1e" exitCode=0 Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.241393 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhlvp" event={"ID":"2fd0e2bc-ed7e-4a38-b107-9217d349ad15","Type":"ContainerDied","Data":"09e4a5fca1f0926f07129896234ee5043d7c6da9fd3c83fd99e821c6979a7f1e"} Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.266378 4985 scope.go:117] "RemoveContainer" containerID="f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.283875 4985 scope.go:117] "RemoveContainer" containerID="922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65" Feb 24 10:16:07 crc kubenswrapper[4985]: E0224 10:16:07.284450 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65\": container with ID starting with 922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65 not found: ID does not exist" containerID="922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.284480 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65"} err="failed to get container status \"922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65\": rpc error: code = NotFound desc = could not find container \"922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65\": container with ID starting with 922820f8a470e60197605565a1e12d8e604ff0a2b2c2a237f5cfad8571533d65 not found: ID does not exist" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.284503 4985 
scope.go:117] "RemoveContainer" containerID="4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69" Feb 24 10:16:07 crc kubenswrapper[4985]: E0224 10:16:07.284741 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69\": container with ID starting with 4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69 not found: ID does not exist" containerID="4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.284762 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69"} err="failed to get container status \"4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69\": rpc error: code = NotFound desc = could not find container \"4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69\": container with ID starting with 4862e7f12d77b140f7ab5f7af947a0014e846f221caf8476a6ca906b53069d69 not found: ID does not exist" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.284773 4985 scope.go:117] "RemoveContainer" containerID="f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570" Feb 24 10:16:07 crc kubenswrapper[4985]: E0224 10:16:07.285018 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570\": container with ID starting with f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570 not found: ID does not exist" containerID="f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.285035 4985 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570"} err="failed to get container status \"f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570\": rpc error: code = NotFound desc = could not find container \"f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570\": container with ID starting with f748cadb2c14af032b903bc46576a2f812070585caecce417f7d6f2d5453c570 not found: ID does not exist" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.285101 4985 scope.go:117] "RemoveContainer" containerID="c094ee19a2d18359139a745230e588b4a99d5fd2e567eff67655a43342efae0f" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.343391 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-catalog-content\") pod \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.343470 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp2f6\" (UniqueName: \"kubernetes.io/projected/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-kube-api-access-mp2f6\") pod \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.343524 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-utilities\") pod \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\" (UID: \"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.352652 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-utilities" (OuterVolumeSpecName: "utilities") pod 
"fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" (UID: "fef137fa-cf9a-4695-a0b5-3863ec2ea3bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.355037 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-kube-api-access-mp2f6" (OuterVolumeSpecName: "kube-api-access-mp2f6") pod "fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" (UID: "fef137fa-cf9a-4695-a0b5-3863ec2ea3bf"). InnerVolumeSpecName "kube-api-access-mp2f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.398290 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.414867 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.418448 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" (UID: "fef137fa-cf9a-4695-a0b5-3863ec2ea3bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.445146 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.445175 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.445185 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp2f6\" (UniqueName: \"kubernetes.io/projected/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf-kube-api-access-mp2f6\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.539980 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qqb8n"] Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.546213 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lpjfw"] Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.546740 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-utilities\") pod \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.546834 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2psfq\" (UniqueName: \"kubernetes.io/projected/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-kube-api-access-2psfq\") pod \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.546939 
4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-catalog-content\") pod \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.547073 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-utilities\") pod \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.547156 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-catalog-content\") pod \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\" (UID: \"2fd0e2bc-ed7e-4a38-b107-9217d349ad15\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.547227 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s6dj\" (UniqueName: \"kubernetes.io/projected/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-kube-api-access-6s6dj\") pod \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\" (UID: \"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.547473 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-utilities" (OuterVolumeSpecName: "utilities") pod "2fd0e2bc-ed7e-4a38-b107-9217d349ad15" (UID: "2fd0e2bc-ed7e-4a38-b107-9217d349ad15"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.547742 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.547910 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-utilities" (OuterVolumeSpecName: "utilities") pod "d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" (UID: "d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.550546 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lpjfw"] Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.551435 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-kube-api-access-6s6dj" (OuterVolumeSpecName: "kube-api-access-6s6dj") pod "d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" (UID: "d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4"). InnerVolumeSpecName "kube-api-access-6s6dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.566009 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-kube-api-access-2psfq" (OuterVolumeSpecName: "kube-api-access-2psfq") pod "2fd0e2bc-ed7e-4a38-b107-9217d349ad15" (UID: "2fd0e2bc-ed7e-4a38-b107-9217d349ad15"). InnerVolumeSpecName "kube-api-access-2psfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.571281 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" (UID: "d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: W0224 10:16:07.583603 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c8bb1e_35de_478b_81fd_ff6dda676508.slice/crio-8ee79219539e990e3df2bb9923d259942b3e17d60a98893fe94fa9007b63aad7 WatchSource:0}: Error finding container 8ee79219539e990e3df2bb9923d259942b3e17d60a98893fe94fa9007b63aad7: Status 404 returned error can't find the container with id 8ee79219539e990e3df2bb9923d259942b3e17d60a98893fe94fa9007b63aad7 Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.649158 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2psfq\" (UniqueName: \"kubernetes.io/projected/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-kube-api-access-2psfq\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.649445 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.649456 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.649465 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s6dj\" 
(UniqueName: \"kubernetes.io/projected/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4-kube-api-access-6s6dj\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.713964 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fd0e2bc-ed7e-4a38-b107-9217d349ad15" (UID: "2fd0e2bc-ed7e-4a38-b107-9217d349ad15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.738429 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.753464 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd0e2bc-ed7e-4a38-b107-9217d349ad15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.793802 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.854262 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrwz6\" (UniqueName: \"kubernetes.io/projected/5186b86d-7d8f-4ed5-b444-991eaf2a793e-kube-api-access-jrwz6\") pod \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.854318 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-catalog-content\") pod \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.854417 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-utilities\") pod \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\" (UID: \"5186b86d-7d8f-4ed5-b444-991eaf2a793e\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.857997 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5186b86d-7d8f-4ed5-b444-991eaf2a793e-kube-api-access-jrwz6" (OuterVolumeSpecName: "kube-api-access-jrwz6") pod "5186b86d-7d8f-4ed5-b444-991eaf2a793e" (UID: "5186b86d-7d8f-4ed5-b444-991eaf2a793e"). InnerVolumeSpecName "kube-api-access-jrwz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.866707 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-utilities" (OuterVolumeSpecName: "utilities") pod "5186b86d-7d8f-4ed5-b444-991eaf2a793e" (UID: "5186b86d-7d8f-4ed5-b444-991eaf2a793e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.905088 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5186b86d-7d8f-4ed5-b444-991eaf2a793e" (UID: "5186b86d-7d8f-4ed5-b444-991eaf2a793e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.956286 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-trusted-ca\") pod \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.956360 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-operator-metrics\") pod \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.956459 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wmsl\" (UniqueName: \"kubernetes.io/projected/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-kube-api-access-8wmsl\") pod \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\" (UID: \"6ef771ef-28ac-46c6-925e-f12a7a70b6c3\") " Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.956709 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.956742 4985 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jrwz6\" (UniqueName: \"kubernetes.io/projected/5186b86d-7d8f-4ed5-b444-991eaf2a793e-kube-api-access-jrwz6\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.956756 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5186b86d-7d8f-4ed5-b444-991eaf2a793e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.957716 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6ef771ef-28ac-46c6-925e-f12a7a70b6c3" (UID: "6ef771ef-28ac-46c6-925e-f12a7a70b6c3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.959985 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-kube-api-access-8wmsl" (OuterVolumeSpecName: "kube-api-access-8wmsl") pod "6ef771ef-28ac-46c6-925e-f12a7a70b6c3" (UID: "6ef771ef-28ac-46c6-925e-f12a7a70b6c3"). InnerVolumeSpecName "kube-api-access-8wmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:07 crc kubenswrapper[4985]: I0224 10:16:07.960192 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6ef771ef-28ac-46c6-925e-f12a7a70b6c3" (UID: "6ef771ef-28ac-46c6-925e-f12a7a70b6c3"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.058168 4985 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.058202 4985 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.058214 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wmsl\" (UniqueName: \"kubernetes.io/projected/6ef771ef-28ac-46c6-925e-f12a7a70b6c3-kube-api-access-8wmsl\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.247669 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l5p66" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.249238 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5p66" event={"ID":"5186b86d-7d8f-4ed5-b444-991eaf2a793e","Type":"ContainerDied","Data":"c407de72161d712288ff2dbbd966ca1b6238e0b9bdc50c5086e8a7d35cd8bf94"} Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.249295 4985 scope.go:117] "RemoveContainer" containerID="c7fe22917b7d142d3fe274699e2f1f5138e0263bfd36f6a94aff95ad95eb8a5a" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.252336 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptllg" event={"ID":"d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4","Type":"ContainerDied","Data":"39001a38ca7d0ccc7dd63480281feff1af039b99ab2bca450a88ad395117d291"} Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.252441 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptllg" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.257440 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n" event={"ID":"b3c8bb1e-35de-478b-81fd-ff6dda676508","Type":"ContainerStarted","Data":"3b432bb9970decc2d6c5fe05918ef2cdb42fbf209d6bb3aecc8dc2dce379e5ed"} Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.257477 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n" event={"ID":"b3c8bb1e-35de-478b-81fd-ff6dda676508","Type":"ContainerStarted","Data":"8ee79219539e990e3df2bb9923d259942b3e17d60a98893fe94fa9007b63aad7"} Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.258254 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.261537 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.261580 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rlq9j" event={"ID":"6ef771ef-28ac-46c6-925e-f12a7a70b6c3","Type":"ContainerDied","Data":"dc31b55ac10c29aca78eb19bd76ffbaaf12fb4838d2dfa04b79f36a9cce91631"} Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.262872 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.264815 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhlvp" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.272920 4985 scope.go:117] "RemoveContainer" containerID="53ac39259d9a89039900e4398d48a67fb894ffcce97799faa2fde54b8ed04874" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.278205 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qqb8n" podStartSLOduration=2.278186374 podStartE2EDuration="2.278186374s" podCreationTimestamp="2026-02-24 10:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:16:08.274811247 +0000 UTC m=+452.749003807" watchObservedRunningTime="2026-02-24 10:16:08.278186374 +0000 UTC m=+452.752378934" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.284863 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" path="/var/lib/kubelet/pods/fef137fa-cf9a-4695-a0b5-3863ec2ea3bf/volumes" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.285746 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhlvp" event={"ID":"2fd0e2bc-ed7e-4a38-b107-9217d349ad15","Type":"ContainerDied","Data":"460fef235535f28e1092b9e0a0826815c7212707e44e41e1eb19d0a0d6840616"} Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.299174 4985 scope.go:117] "RemoveContainer" containerID="8afe751e1cca698fb0d943baa943410c58c42409ca1e1eaa81771f5e6aaaa4bd" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.323512 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5p66"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.324026 4985 scope.go:117] "RemoveContainer" containerID="a694ca5350c8dd87df4e719af409ee5abaa11bfb56d6b48a81555580033949be" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 
10:16:08.328591 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l5p66"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.340352 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rlq9j"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.346670 4985 scope.go:117] "RemoveContainer" containerID="5441a26ac037a837c07fa4b99a47e59ed6b248f18309c29a5888aeebf0b65e8a" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.353090 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rlq9j"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.359866 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptllg"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.363794 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptllg"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.365355 4985 scope.go:117] "RemoveContainer" containerID="167acfba7cdf3cb566672b5c66f5af5dad7faf042c89cad0f57fafefc90d5cdd" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.369174 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rhlvp"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.372695 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rhlvp"] Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.383576 4985 scope.go:117] "RemoveContainer" containerID="3642100016b8d9597fa4ebee463d4b2a820e327e6bcd36dd9448cc560a2e42b7" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.396745 4985 scope.go:117] "RemoveContainer" containerID="09e4a5fca1f0926f07129896234ee5043d7c6da9fd3c83fd99e821c6979a7f1e" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.408045 4985 scope.go:117] "RemoveContainer" 
containerID="f97a4fe8d5def25f9a0ad14d50af81076a971b1a5ff9999faf958b7e0e497653" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.423695 4985 scope.go:117] "RemoveContainer" containerID="3661b5f3e491389b533ed8a120be3900c7adfbc604c62a0ba448f1aebf7bd121" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.891088 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zjj6q"] Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.891938 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.891953 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.891964 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.891972 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.891988 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.891996 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892012 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892020 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" 
containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892036 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892043 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892059 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892066 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892081 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892089 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892098 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892110 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892120 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892129 4985 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892146 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892153 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892165 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892172 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892187 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892196 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="extract-utilities" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892216 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892342 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: E0224 10:16:08.892362 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892372 4985 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="extract-content" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892602 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892625 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892637 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892656 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" containerName="marketplace-operator" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892673 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.892688 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef137fa-cf9a-4695-a0b5-3863ec2ea3bf" containerName="registry-server" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.893940 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.898446 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 10:16:08 crc kubenswrapper[4985]: I0224 10:16:08.908461 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjj6q"] Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.069067 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-catalog-content\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.069147 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-utilities\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.069180 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-249b2\" (UniqueName: \"kubernetes.io/projected/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-kube-api-access-249b2\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.087826 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s4z6k"] Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.088969 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.091550 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.102398 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4z6k"] Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.170371 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-249b2\" (UniqueName: \"kubernetes.io/projected/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-kube-api-access-249b2\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.170516 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-catalog-content\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.171127 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-utilities\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.171139 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-catalog-content\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " 
pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.173639 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-utilities\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.192509 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-249b2\" (UniqueName: \"kubernetes.io/projected/fb15bbfa-2834-40e2-9785-1eb0672b0e8d-kube-api-access-249b2\") pod \"redhat-marketplace-zjj6q\" (UID: \"fb15bbfa-2834-40e2-9785-1eb0672b0e8d\") " pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.224594 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.275269 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwr9q\" (UniqueName: \"kubernetes.io/projected/c05d506f-dfe7-4e47-8645-0795d488aa18-kube-api-access-jwr9q\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.275321 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05d506f-dfe7-4e47-8645-0795d488aa18-catalog-content\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.275373 4985 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05d506f-dfe7-4e47-8645-0795d488aa18-utilities\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.376866 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05d506f-dfe7-4e47-8645-0795d488aa18-utilities\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.377368 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwr9q\" (UniqueName: \"kubernetes.io/projected/c05d506f-dfe7-4e47-8645-0795d488aa18-kube-api-access-jwr9q\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.377409 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05d506f-dfe7-4e47-8645-0795d488aa18-catalog-content\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.378097 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05d506f-dfe7-4e47-8645-0795d488aa18-catalog-content\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.378230 4985 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05d506f-dfe7-4e47-8645-0795d488aa18-utilities\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.395839 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwr9q\" (UniqueName: \"kubernetes.io/projected/c05d506f-dfe7-4e47-8645-0795d488aa18-kube-api-access-jwr9q\") pod \"certified-operators-s4z6k\" (UID: \"c05d506f-dfe7-4e47-8645-0795d488aa18\") " pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:09 crc kubenswrapper[4985]: I0224 10:16:09.408455 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:09.602699 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjj6q"] Feb 24 10:16:10 crc kubenswrapper[4985]: W0224 10:16:09.610518 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb15bbfa_2834_40e2_9785_1eb0672b0e8d.slice/crio-462ca05913bb53174fd52dc5b013149f63f2ac6bbe9f3a43799cab0806ecfd54 WatchSource:0}: Error finding container 462ca05913bb53174fd52dc5b013149f63f2ac6bbe9f3a43799cab0806ecfd54: Status 404 returned error can't find the container with id 462ca05913bb53174fd52dc5b013149f63f2ac6bbe9f3a43799cab0806ecfd54 Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.275812 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd0e2bc-ed7e-4a38-b107-9217d349ad15" path="/var/lib/kubelet/pods/2fd0e2bc-ed7e-4a38-b107-9217d349ad15/volumes" Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.276957 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5186b86d-7d8f-4ed5-b444-991eaf2a793e" path="/var/lib/kubelet/pods/5186b86d-7d8f-4ed5-b444-991eaf2a793e/volumes" Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.277614 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef771ef-28ac-46c6-925e-f12a7a70b6c3" path="/var/lib/kubelet/pods/6ef771ef-28ac-46c6-925e-f12a7a70b6c3/volumes" Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.278751 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4" path="/var/lib/kubelet/pods/d8f7b4e0-a896-4c14-bfea-fa6066bfd4c4/volumes" Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.291854 4985 generic.go:334] "Generic (PLEG): container finished" podID="fb15bbfa-2834-40e2-9785-1eb0672b0e8d" containerID="363c7bc40af561d004b5edc330f09e80b069cde83911580c1181426a99cb6d8d" exitCode=0 Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.291991 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjj6q" event={"ID":"fb15bbfa-2834-40e2-9785-1eb0672b0e8d","Type":"ContainerDied","Data":"363c7bc40af561d004b5edc330f09e80b069cde83911580c1181426a99cb6d8d"} Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.292067 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjj6q" event={"ID":"fb15bbfa-2834-40e2-9785-1eb0672b0e8d","Type":"ContainerStarted","Data":"462ca05913bb53174fd52dc5b013149f63f2ac6bbe9f3a43799cab0806ecfd54"} Feb 24 10:16:10 crc kubenswrapper[4985]: I0224 10:16:10.540525 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4z6k"] Feb 24 10:16:10 crc kubenswrapper[4985]: W0224 10:16:10.544694 4985 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc05d506f_dfe7_4e47_8645_0795d488aa18.slice/crio-24cc0187556c94952bddae6f960355746d5184a40c01f8c4bbf32d52fc8de663 WatchSource:0}: Error finding container 24cc0187556c94952bddae6f960355746d5184a40c01f8c4bbf32d52fc8de663: Status 404 returned error can't find the container with id 24cc0187556c94952bddae6f960355746d5184a40c01f8c4bbf32d52fc8de663 Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.285159 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkm8l"] Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.290393 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.291554 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkm8l"] Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.292957 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.312516 4985 generic.go:334] "Generic (PLEG): container finished" podID="fb15bbfa-2834-40e2-9785-1eb0672b0e8d" containerID="ad6f5087b7bdf2395e3e78ecc21e79cdf855d8cf87e2c457e4e02bd25cd726bd" exitCode=0 Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.312591 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjj6q" event={"ID":"fb15bbfa-2834-40e2-9785-1eb0672b0e8d","Type":"ContainerDied","Data":"ad6f5087b7bdf2395e3e78ecc21e79cdf855d8cf87e2c457e4e02bd25cd726bd"} Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.315713 4985 generic.go:334] "Generic (PLEG): container finished" podID="c05d506f-dfe7-4e47-8645-0795d488aa18" containerID="e1c6bccdb835fc30fa6f272355370e08f25ccf7fdcbcd536260607e91b03c892" exitCode=0 Feb 24 10:16:11 crc kubenswrapper[4985]: 
I0224 10:16:11.315769 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4z6k" event={"ID":"c05d506f-dfe7-4e47-8645-0795d488aa18","Type":"ContainerDied","Data":"e1c6bccdb835fc30fa6f272355370e08f25ccf7fdcbcd536260607e91b03c892"} Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.315801 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4z6k" event={"ID":"c05d506f-dfe7-4e47-8645-0795d488aa18","Type":"ContainerStarted","Data":"24cc0187556c94952bddae6f960355746d5184a40c01f8c4bbf32d52fc8de663"} Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.425913 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3b9c70-4e38-43b5-9a50-b315637ddb07-catalog-content\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.426424 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3b9c70-4e38-43b5-9a50-b315637ddb07-utilities\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.426672 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzrzr\" (UniqueName: \"kubernetes.io/projected/4f3b9c70-4e38-43b5-9a50-b315637ddb07-kube-api-access-hzrzr\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.484990 4985 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-zwdpc"] Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.486221 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.490427 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.495345 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwdpc"] Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528059 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrzr\" (UniqueName: \"kubernetes.io/projected/4f3b9c70-4e38-43b5-9a50-b315637ddb07-kube-api-access-hzrzr\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528118 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3b9c70-4e38-43b5-9a50-b315637ddb07-catalog-content\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528144 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3b9c70-4e38-43b5-9a50-b315637ddb07-utilities\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528191 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6a013a8b-7078-4579-87a8-543ad810e056-utilities\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528221 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a013a8b-7078-4579-87a8-543ad810e056-catalog-content\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528240 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b6vd\" (UniqueName: \"kubernetes.io/projected/6a013a8b-7078-4579-87a8-543ad810e056-kube-api-access-5b6vd\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528610 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3b9c70-4e38-43b5-9a50-b315637ddb07-catalog-content\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.528634 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3b9c70-4e38-43b5-9a50-b315637ddb07-utilities\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.549711 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrzr\" (UniqueName: 
\"kubernetes.io/projected/4f3b9c70-4e38-43b5-9a50-b315637ddb07-kube-api-access-hzrzr\") pod \"redhat-operators-zkm8l\" (UID: \"4f3b9c70-4e38-43b5-9a50-b315637ddb07\") " pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.609962 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.630642 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a013a8b-7078-4579-87a8-543ad810e056-utilities\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.630709 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a013a8b-7078-4579-87a8-543ad810e056-catalog-content\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.630748 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b6vd\" (UniqueName: \"kubernetes.io/projected/6a013a8b-7078-4579-87a8-543ad810e056-kube-api-access-5b6vd\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.631368 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a013a8b-7078-4579-87a8-543ad810e056-utilities\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc 
kubenswrapper[4985]: I0224 10:16:11.631568 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a013a8b-7078-4579-87a8-543ad810e056-catalog-content\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.650603 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b6vd\" (UniqueName: \"kubernetes.io/projected/6a013a8b-7078-4579-87a8-543ad810e056-kube-api-access-5b6vd\") pod \"community-operators-zwdpc\" (UID: \"6a013a8b-7078-4579-87a8-543ad810e056\") " pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:11 crc kubenswrapper[4985]: I0224 10:16:11.804636 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.008834 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkm8l"] Feb 24 10:16:12 crc kubenswrapper[4985]: W0224 10:16:12.012286 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3b9c70_4e38_43b5_9a50_b315637ddb07.slice/crio-f708af5d8320e079327654133182e983c0fba272e53c588a5e4c7324011758b7 WatchSource:0}: Error finding container f708af5d8320e079327654133182e983c0fba272e53c588a5e4c7324011758b7: Status 404 returned error can't find the container with id f708af5d8320e079327654133182e983c0fba272e53c588a5e4c7324011758b7 Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.182606 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwdpc"] Feb 24 10:16:12 crc kubenswrapper[4985]: W0224 10:16:12.188078 4985 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a013a8b_7078_4579_87a8_543ad810e056.slice/crio-e53072fb2c63fea6556b5fd5f68ee62fcb38c5fbec8ff4a235e4ff82e2fc1e08 WatchSource:0}: Error finding container e53072fb2c63fea6556b5fd5f68ee62fcb38c5fbec8ff4a235e4ff82e2fc1e08: Status 404 returned error can't find the container with id e53072fb2c63fea6556b5fd5f68ee62fcb38c5fbec8ff4a235e4ff82e2fc1e08 Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.323543 4985 generic.go:334] "Generic (PLEG): container finished" podID="4f3b9c70-4e38-43b5-9a50-b315637ddb07" containerID="889ed4c712b090d8e27df83e4fa8c384cd3f5210b758ea67783f2dd609e8ef55" exitCode=0 Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.323620 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkm8l" event={"ID":"4f3b9c70-4e38-43b5-9a50-b315637ddb07","Type":"ContainerDied","Data":"889ed4c712b090d8e27df83e4fa8c384cd3f5210b758ea67783f2dd609e8ef55"} Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.323659 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkm8l" event={"ID":"4f3b9c70-4e38-43b5-9a50-b315637ddb07","Type":"ContainerStarted","Data":"f708af5d8320e079327654133182e983c0fba272e53c588a5e4c7324011758b7"} Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.326376 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwdpc" event={"ID":"6a013a8b-7078-4579-87a8-543ad810e056","Type":"ContainerStarted","Data":"e53072fb2c63fea6556b5fd5f68ee62fcb38c5fbec8ff4a235e4ff82e2fc1e08"} Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.332036 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4z6k" event={"ID":"c05d506f-dfe7-4e47-8645-0795d488aa18","Type":"ContainerStarted","Data":"efc3aabf523c4e2fc4870f4aeb6c96e82562094fc8fca0ae3bade602adc53572"} Feb 24 10:16:12 crc kubenswrapper[4985]: 
I0224 10:16:12.334359 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjj6q" event={"ID":"fb15bbfa-2834-40e2-9785-1eb0672b0e8d","Type":"ContainerStarted","Data":"103769d8450d8f9b8126c05eaa18f2e63b82fcbebfef8bbaffb700eb71fa4f01"} Feb 24 10:16:12 crc kubenswrapper[4985]: I0224 10:16:12.369648 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zjj6q" podStartSLOduration=2.671767114 podStartE2EDuration="4.369627167s" podCreationTimestamp="2026-02-24 10:16:08 +0000 UTC" firstStartedPulling="2026-02-24 10:16:10.29356413 +0000 UTC m=+454.767756690" lastFinishedPulling="2026-02-24 10:16:11.991424183 +0000 UTC m=+456.465616743" observedRunningTime="2026-02-24 10:16:12.36791913 +0000 UTC m=+456.842111690" watchObservedRunningTime="2026-02-24 10:16:12.369627167 +0000 UTC m=+456.843819727" Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.353415 4985 generic.go:334] "Generic (PLEG): container finished" podID="6a013a8b-7078-4579-87a8-543ad810e056" containerID="035c96f6559e1abc539b9e79d940057c67004d8174dfa7676ef6f20a9ef912a7" exitCode=0 Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.354268 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwdpc" event={"ID":"6a013a8b-7078-4579-87a8-543ad810e056","Type":"ContainerDied","Data":"035c96f6559e1abc539b9e79d940057c67004d8174dfa7676ef6f20a9ef912a7"} Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.360499 4985 generic.go:334] "Generic (PLEG): container finished" podID="c05d506f-dfe7-4e47-8645-0795d488aa18" containerID="efc3aabf523c4e2fc4870f4aeb6c96e82562094fc8fca0ae3bade602adc53572" exitCode=0 Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.360656 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4z6k" 
event={"ID":"c05d506f-dfe7-4e47-8645-0795d488aa18","Type":"ContainerDied","Data":"efc3aabf523c4e2fc4870f4aeb6c96e82562094fc8fca0ae3bade602adc53572"} Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.625212 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.625295 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.625337 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.625943 4985 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af9e8809fa21abba0bc6f989ce1de36b1b356edb744ae8b075b66b3c7afc91af"} pod="openshift-machine-config-operator/machine-config-daemon-hq52w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:16:13 crc kubenswrapper[4985]: I0224 10:16:13.626010 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" containerID="cri-o://af9e8809fa21abba0bc6f989ce1de36b1b356edb744ae8b075b66b3c7afc91af" gracePeriod=600 Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.369303 4985 
generic.go:334] "Generic (PLEG): container finished" podID="6a013a8b-7078-4579-87a8-543ad810e056" containerID="2f6a45bd236ed78604b361532a01c583ae012a231af78fd42083cdb28b6ba738" exitCode=0 Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.369368 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwdpc" event={"ID":"6a013a8b-7078-4579-87a8-543ad810e056","Type":"ContainerDied","Data":"2f6a45bd236ed78604b361532a01c583ae012a231af78fd42083cdb28b6ba738"} Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.371950 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4z6k" event={"ID":"c05d506f-dfe7-4e47-8645-0795d488aa18","Type":"ContainerStarted","Data":"eedb41c3fb2a0fe8c769376611e44ec54bf135d559166cb7e1e978841bc0ebde"} Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.374453 4985 generic.go:334] "Generic (PLEG): container finished" podID="11c1c7b8-18df-4583-849f-76b62544344b" containerID="af9e8809fa21abba0bc6f989ce1de36b1b356edb744ae8b075b66b3c7afc91af" exitCode=0 Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.374494 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerDied","Data":"af9e8809fa21abba0bc6f989ce1de36b1b356edb744ae8b075b66b3c7afc91af"} Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.374512 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"362c3bf398ab14b4ae57557da67c3d09f3b4f3cc2e68b6fa0d9b719d68656f2e"} Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.374527 4985 scope.go:117] "RemoveContainer" containerID="14cb7316040a9cb3cb3236ed16ecc17eee99c6935a646b86604ea8285e1aab0b" Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 
10:16:14.377426 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkm8l" event={"ID":"4f3b9c70-4e38-43b5-9a50-b315637ddb07","Type":"ContainerStarted","Data":"3989811bd162e693deaf5da680e0de07dc7cd5bdb645f062293caed9141eee8e"} Feb 24 10:16:14 crc kubenswrapper[4985]: I0224 10:16:14.402723 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s4z6k" podStartSLOduration=2.983151294 podStartE2EDuration="5.402706154s" podCreationTimestamp="2026-02-24 10:16:09 +0000 UTC" firstStartedPulling="2026-02-24 10:16:11.317145024 +0000 UTC m=+455.791337584" lastFinishedPulling="2026-02-24 10:16:13.736699884 +0000 UTC m=+458.210892444" observedRunningTime="2026-02-24 10:16:14.399645811 +0000 UTC m=+458.873838371" watchObservedRunningTime="2026-02-24 10:16:14.402706154 +0000 UTC m=+458.876898714" Feb 24 10:16:15 crc kubenswrapper[4985]: I0224 10:16:15.401416 4985 generic.go:334] "Generic (PLEG): container finished" podID="4f3b9c70-4e38-43b5-9a50-b315637ddb07" containerID="3989811bd162e693deaf5da680e0de07dc7cd5bdb645f062293caed9141eee8e" exitCode=0 Feb 24 10:16:15 crc kubenswrapper[4985]: I0224 10:16:15.402506 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkm8l" event={"ID":"4f3b9c70-4e38-43b5-9a50-b315637ddb07","Type":"ContainerDied","Data":"3989811bd162e693deaf5da680e0de07dc7cd5bdb645f062293caed9141eee8e"} Feb 24 10:16:16 crc kubenswrapper[4985]: I0224 10:16:16.423366 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwdpc" event={"ID":"6a013a8b-7078-4579-87a8-543ad810e056","Type":"ContainerStarted","Data":"c458b1d7ef46c43a065a2594ab04a168509e955e27445fab749bf6586a927b82"} Feb 24 10:16:16 crc kubenswrapper[4985]: I0224 10:16:16.425516 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkm8l" 
event={"ID":"4f3b9c70-4e38-43b5-9a50-b315637ddb07","Type":"ContainerStarted","Data":"a0774353eaa3705d87a553645fb7600499847d299e3ac145a404774f565e8189"} Feb 24 10:16:16 crc kubenswrapper[4985]: I0224 10:16:16.441644 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwdpc" podStartSLOduration=3.613022571 podStartE2EDuration="5.44162679s" podCreationTimestamp="2026-02-24 10:16:11 +0000 UTC" firstStartedPulling="2026-02-24 10:16:13.359012825 +0000 UTC m=+457.833205385" lastFinishedPulling="2026-02-24 10:16:15.187617044 +0000 UTC m=+459.661809604" observedRunningTime="2026-02-24 10:16:16.440233192 +0000 UTC m=+460.914425762" watchObservedRunningTime="2026-02-24 10:16:16.44162679 +0000 UTC m=+460.915819350" Feb 24 10:16:16 crc kubenswrapper[4985]: I0224 10:16:16.469954 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkm8l" podStartSLOduration=3.049158879 podStartE2EDuration="5.469933977s" podCreationTimestamp="2026-02-24 10:16:11 +0000 UTC" firstStartedPulling="2026-02-24 10:16:13.363769884 +0000 UTC m=+457.837962444" lastFinishedPulling="2026-02-24 10:16:15.784544982 +0000 UTC m=+460.258737542" observedRunningTime="2026-02-24 10:16:16.467414029 +0000 UTC m=+460.941606609" watchObservedRunningTime="2026-02-24 10:16:16.469933977 +0000 UTC m=+460.944126557" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.225532 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.227012 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.269783 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:19 crc 
kubenswrapper[4985]: I0224 10:16:19.409540 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.409738 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.454436 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.483902 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zjj6q" Feb 24 10:16:19 crc kubenswrapper[4985]: I0224 10:16:19.497615 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s4z6k" Feb 24 10:16:21 crc kubenswrapper[4985]: I0224 10:16:21.611128 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:21 crc kubenswrapper[4985]: I0224 10:16:21.611477 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:21 crc kubenswrapper[4985]: I0224 10:16:21.649614 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:21 crc kubenswrapper[4985]: I0224 10:16:21.805549 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:21 crc kubenswrapper[4985]: I0224 10:16:21.805779 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:21 crc kubenswrapper[4985]: I0224 10:16:21.841599 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:22 crc kubenswrapper[4985]: I0224 10:16:22.518749 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zkm8l" Feb 24 10:16:22 crc kubenswrapper[4985]: I0224 10:16:22.521070 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwdpc" Feb 24 10:16:24 crc kubenswrapper[4985]: I0224 10:16:24.751963 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7gp42" Feb 24 10:16:24 crc kubenswrapper[4985]: I0224 10:16:24.816687 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggvx6"] Feb 24 10:16:49 crc kubenswrapper[4985]: I0224 10:16:49.862074 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" podUID="6d51aec2-2b30-49d0-8aea-019bea882940" containerName="registry" containerID="cri-o://ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09" gracePeriod=30 Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.235399 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342356 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-registry-tls\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342418 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d51aec2-2b30-49d0-8aea-019bea882940-installation-pull-secrets\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342674 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342738 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d51aec2-2b30-49d0-8aea-019bea882940-ca-trust-extracted\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342809 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-trusted-ca\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342849 4985 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-registry-certificates\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342931 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-bound-sa-token\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.342965 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxnmj\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-kube-api-access-rxnmj\") pod \"6d51aec2-2b30-49d0-8aea-019bea882940\" (UID: \"6d51aec2-2b30-49d0-8aea-019bea882940\") " Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.344173 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.344183 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.346272 4985 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.346313 4985 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d51aec2-2b30-49d0-8aea-019bea882940-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.348409 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-kube-api-access-rxnmj" (OuterVolumeSpecName: "kube-api-access-rxnmj") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "kube-api-access-rxnmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.350196 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.352424 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.360100 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d51aec2-2b30-49d0-8aea-019bea882940-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.361216 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.361752 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d51aec2-2b30-49d0-8aea-019bea882940-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6d51aec2-2b30-49d0-8aea-019bea882940" (UID: "6d51aec2-2b30-49d0-8aea-019bea882940"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.447354 4985 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.447381 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxnmj\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-kube-api-access-rxnmj\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.447393 4985 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d51aec2-2b30-49d0-8aea-019bea882940-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.447402 4985 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d51aec2-2b30-49d0-8aea-019bea882940-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.447410 4985 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d51aec2-2b30-49d0-8aea-019bea882940-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.624510 4985 generic.go:334] "Generic (PLEG): container finished" podID="6d51aec2-2b30-49d0-8aea-019bea882940" containerID="ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09" exitCode=0 Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.624555 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.624576 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" event={"ID":"6d51aec2-2b30-49d0-8aea-019bea882940","Type":"ContainerDied","Data":"ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09"} Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.624974 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggvx6" event={"ID":"6d51aec2-2b30-49d0-8aea-019bea882940","Type":"ContainerDied","Data":"25d4eb64fe3c05fdae5b8092902a2b51b4d639f6b6465ab7c6bb6eb3a8ff8b59"} Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.624995 4985 scope.go:117] "RemoveContainer" containerID="ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.639878 4985 scope.go:117] "RemoveContainer" containerID="ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09" Feb 24 10:16:50 crc kubenswrapper[4985]: E0224 10:16:50.642214 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09\": container with ID starting with ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09 not found: ID does not exist" containerID="ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.642252 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09"} err="failed to get container status \"ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09\": rpc error: code = NotFound desc = could not find container 
\"ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09\": container with ID starting with ca0a9a8cf47f414e0fb5023920285a538c942ba1d94d72baf956b0cc92c87a09 not found: ID does not exist" Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.668036 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggvx6"] Feb 24 10:16:50 crc kubenswrapper[4985]: I0224 10:16:50.682524 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggvx6"] Feb 24 10:16:52 crc kubenswrapper[4985]: I0224 10:16:52.271638 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d51aec2-2b30-49d0-8aea-019bea882940" path="/var/lib/kubelet/pods/6d51aec2-2b30-49d0-8aea-019bea882940/volumes" Feb 24 10:18:13 crc kubenswrapper[4985]: I0224 10:18:13.625073 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:18:13 crc kubenswrapper[4985]: I0224 10:18:13.625830 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:18:43 crc kubenswrapper[4985]: I0224 10:18:43.625044 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:18:43 crc kubenswrapper[4985]: I0224 10:18:43.625674 4985 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:19:13 crc kubenswrapper[4985]: I0224 10:19:13.625441 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:19:13 crc kubenswrapper[4985]: I0224 10:19:13.626074 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:19:13 crc kubenswrapper[4985]: I0224 10:19:13.626136 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:19:13 crc kubenswrapper[4985]: I0224 10:19:13.626853 4985 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"362c3bf398ab14b4ae57557da67c3d09f3b4f3cc2e68b6fa0d9b719d68656f2e"} pod="openshift-machine-config-operator/machine-config-daemon-hq52w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:19:13 crc kubenswrapper[4985]: I0224 10:19:13.626952 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" 
containerID="cri-o://362c3bf398ab14b4ae57557da67c3d09f3b4f3cc2e68b6fa0d9b719d68656f2e" gracePeriod=600 Feb 24 10:19:14 crc kubenswrapper[4985]: I0224 10:19:14.423252 4985 generic.go:334] "Generic (PLEG): container finished" podID="11c1c7b8-18df-4583-849f-76b62544344b" containerID="362c3bf398ab14b4ae57557da67c3d09f3b4f3cc2e68b6fa0d9b719d68656f2e" exitCode=0 Feb 24 10:19:14 crc kubenswrapper[4985]: I0224 10:19:14.423333 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerDied","Data":"362c3bf398ab14b4ae57557da67c3d09f3b4f3cc2e68b6fa0d9b719d68656f2e"} Feb 24 10:19:14 crc kubenswrapper[4985]: I0224 10:19:14.423529 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"5c4f0d0bca61173f77c9ab28eb284d806888a49c4136cca4d749de1d6a14995f"} Feb 24 10:19:14 crc kubenswrapper[4985]: I0224 10:19:14.423550 4985 scope.go:117] "RemoveContainer" containerID="af9e8809fa21abba0bc6f989ce1de36b1b356edb744ae8b075b66b3c7afc91af" Feb 24 10:20:15 crc kubenswrapper[4985]: I0224 10:20:15.304602 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42612: no serving certificate available for the kubelet" Feb 24 10:21:13 crc kubenswrapper[4985]: I0224 10:21:13.624480 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:21:13 crc kubenswrapper[4985]: I0224 10:21:13.625226 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.236716 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl"] Feb 24 10:21:28 crc kubenswrapper[4985]: E0224 10:21:28.237552 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d51aec2-2b30-49d0-8aea-019bea882940" containerName="registry" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.237571 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d51aec2-2b30-49d0-8aea-019bea882940" containerName="registry" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.237716 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d51aec2-2b30-49d0-8aea-019bea882940" containerName="registry" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.238171 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.239974 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.240805 4985 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-tbdq6" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.241098 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.242054 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-c6bjz"] Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.242875 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c6bjz" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.244873 4985 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gbzj2" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.247841 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl"] Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.254333 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c6bjz"] Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.259178 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-brbqk"] Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.265686 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.268308 4985 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-shnl5" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.274683 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-brbqk"] Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.330088 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnj5\" (UniqueName: \"kubernetes.io/projected/2907b666-a589-4626-ac57-15dec0eac559-kube-api-access-vpnj5\") pod \"cert-manager-cainjector-cf98fcc89-fj5pl\" (UID: \"2907b666-a589-4626-ac57-15dec0eac559\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.330132 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwnlb\" (UniqueName: 
\"kubernetes.io/projected/403fccac-6447-48b1-b794-2cb97388589d-kube-api-access-hwnlb\") pod \"cert-manager-858654f9db-c6bjz\" (UID: \"403fccac-6447-48b1-b794-2cb97388589d\") " pod="cert-manager/cert-manager-858654f9db-c6bjz" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.431741 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czzzb\" (UniqueName: \"kubernetes.io/projected/41424698-9c43-4660-9b59-a0afc5b00e7d-kube-api-access-czzzb\") pod \"cert-manager-webhook-687f57d79b-brbqk\" (UID: \"41424698-9c43-4660-9b59-a0afc5b00e7d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.431816 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnj5\" (UniqueName: \"kubernetes.io/projected/2907b666-a589-4626-ac57-15dec0eac559-kube-api-access-vpnj5\") pod \"cert-manager-cainjector-cf98fcc89-fj5pl\" (UID: \"2907b666-a589-4626-ac57-15dec0eac559\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.431840 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwnlb\" (UniqueName: \"kubernetes.io/projected/403fccac-6447-48b1-b794-2cb97388589d-kube-api-access-hwnlb\") pod \"cert-manager-858654f9db-c6bjz\" (UID: \"403fccac-6447-48b1-b794-2cb97388589d\") " pod="cert-manager/cert-manager-858654f9db-c6bjz" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.451720 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnj5\" (UniqueName: \"kubernetes.io/projected/2907b666-a589-4626-ac57-15dec0eac559-kube-api-access-vpnj5\") pod \"cert-manager-cainjector-cf98fcc89-fj5pl\" (UID: \"2907b666-a589-4626-ac57-15dec0eac559\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.458901 4985 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwnlb\" (UniqueName: \"kubernetes.io/projected/403fccac-6447-48b1-b794-2cb97388589d-kube-api-access-hwnlb\") pod \"cert-manager-858654f9db-c6bjz\" (UID: \"403fccac-6447-48b1-b794-2cb97388589d\") " pod="cert-manager/cert-manager-858654f9db-c6bjz" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.533119 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czzzb\" (UniqueName: \"kubernetes.io/projected/41424698-9c43-4660-9b59-a0afc5b00e7d-kube-api-access-czzzb\") pod \"cert-manager-webhook-687f57d79b-brbqk\" (UID: \"41424698-9c43-4660-9b59-a0afc5b00e7d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.547248 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czzzb\" (UniqueName: \"kubernetes.io/projected/41424698-9c43-4660-9b59-a0afc5b00e7d-kube-api-access-czzzb\") pod \"cert-manager-webhook-687f57d79b-brbqk\" (UID: \"41424698-9c43-4660-9b59-a0afc5b00e7d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.566363 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.578717 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c6bjz" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.586373 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.795024 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c6bjz"] Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.823424 4985 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.839767 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl"] Feb 24 10:21:28 crc kubenswrapper[4985]: W0224 10:21:28.847175 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2907b666_a589_4626_ac57_15dec0eac559.slice/crio-f93fa48a26181b203af0cc750a314cd3229a23ed145ac66961add4b22796eec5 WatchSource:0}: Error finding container f93fa48a26181b203af0cc750a314cd3229a23ed145ac66961add4b22796eec5: Status 404 returned error can't find the container with id f93fa48a26181b203af0cc750a314cd3229a23ed145ac66961add4b22796eec5 Feb 24 10:21:28 crc kubenswrapper[4985]: I0224 10:21:28.875545 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-brbqk"] Feb 24 10:21:28 crc kubenswrapper[4985]: W0224 10:21:28.879567 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41424698_9c43_4660_9b59_a0afc5b00e7d.slice/crio-58c94d4f132311550773c5f3706c3effe8a6f5174918f81733f2f352ae4b1120 WatchSource:0}: Error finding container 58c94d4f132311550773c5f3706c3effe8a6f5174918f81733f2f352ae4b1120: Status 404 returned error can't find the container with id 58c94d4f132311550773c5f3706c3effe8a6f5174918f81733f2f352ae4b1120 Feb 24 10:21:29 crc kubenswrapper[4985]: I0224 10:21:29.244355 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" event={"ID":"41424698-9c43-4660-9b59-a0afc5b00e7d","Type":"ContainerStarted","Data":"58c94d4f132311550773c5f3706c3effe8a6f5174918f81733f2f352ae4b1120"} Feb 24 10:21:29 crc kubenswrapper[4985]: I0224 10:21:29.246019 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c6bjz" event={"ID":"403fccac-6447-48b1-b794-2cb97388589d","Type":"ContainerStarted","Data":"fd907372d1d9669e1720135f704c35acc63bb70d55124b7ad3b7dca23b42d582"} Feb 24 10:21:29 crc kubenswrapper[4985]: I0224 10:21:29.247421 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" event={"ID":"2907b666-a589-4626-ac57-15dec0eac559","Type":"ContainerStarted","Data":"f93fa48a26181b203af0cc750a314cd3229a23ed145ac66961add4b22796eec5"} Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.267155 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" event={"ID":"2907b666-a589-4626-ac57-15dec0eac559","Type":"ContainerStarted","Data":"085f7e36d94e9b9bb552c004a90f3282aa3e1b4195f3670c998020384b9f2f95"} Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.269577 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" event={"ID":"41424698-9c43-4660-9b59-a0afc5b00e7d","Type":"ContainerStarted","Data":"92823dc92ac2fe7b8fb37acd9792c1bd73eececca80d77bdb4f4192c8a47fda9"} Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.269692 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.272004 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c6bjz" 
event={"ID":"403fccac-6447-48b1-b794-2cb97388589d","Type":"ContainerStarted","Data":"c061a62bdb5c1e9f0025a6cf75f07b198996c47a37e87b798e96c2039f2bb2ed"} Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.285791 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-fj5pl" podStartSLOduration=1.9056286249999999 podStartE2EDuration="5.285770002s" podCreationTimestamp="2026-02-24 10:21:28 +0000 UTC" firstStartedPulling="2026-02-24 10:21:28.849380832 +0000 UTC m=+773.323573392" lastFinishedPulling="2026-02-24 10:21:32.229522209 +0000 UTC m=+776.703714769" observedRunningTime="2026-02-24 10:21:33.280110247 +0000 UTC m=+777.754302817" watchObservedRunningTime="2026-02-24 10:21:33.285770002 +0000 UTC m=+777.759962572" Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.302449 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" podStartSLOduration=1.9589655339999998 podStartE2EDuration="5.302428378s" podCreationTimestamp="2026-02-24 10:21:28 +0000 UTC" firstStartedPulling="2026-02-24 10:21:28.885374486 +0000 UTC m=+773.359567046" lastFinishedPulling="2026-02-24 10:21:32.22883733 +0000 UTC m=+776.703029890" observedRunningTime="2026-02-24 10:21:33.29994394 +0000 UTC m=+777.774136560" watchObservedRunningTime="2026-02-24 10:21:33.302428378 +0000 UTC m=+777.776620958" Feb 24 10:21:33 crc kubenswrapper[4985]: I0224 10:21:33.319693 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-c6bjz" podStartSLOduration=1.847477102 podStartE2EDuration="5.31966584s" podCreationTimestamp="2026-02-24 10:21:28 +0000 UTC" firstStartedPulling="2026-02-24 10:21:28.823107102 +0000 UTC m=+773.297299662" lastFinishedPulling="2026-02-24 10:21:32.29529584 +0000 UTC m=+776.769488400" observedRunningTime="2026-02-24 10:21:33.319577638 +0000 UTC m=+777.793770228" 
watchObservedRunningTime="2026-02-24 10:21:33.31966584 +0000 UTC m=+777.793858410" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.515314 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-27dpt"] Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.516581 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-controller" containerID="cri-o://7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.516676 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="nbdb" containerID="cri-o://3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.516756 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.516816 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-acl-logging" containerID="cri-o://e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.516909 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="sbdb" 
containerID="cri-o://e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.516787 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-node" containerID="cri-o://97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.517022 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="northd" containerID="cri-o://cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.575310 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" containerID="cri-o://0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" gracePeriod=30 Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.591008 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-brbqk" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.822102 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/3.log" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.823715 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovn-acl-logging/0.log" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.824090 4985 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovn-controller/0.log" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.824495 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877385 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mlwhn"] Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877701 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877735 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877744 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="northd" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877751 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="northd" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877759 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="sbdb" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877766 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="sbdb" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877773 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kubecfg-setup" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877779 4985 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kubecfg-setup" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877810 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877815 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877824 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877830 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877839 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-ovn-metrics" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877845 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-ovn-metrics" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877854 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877859 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877902 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-node" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877908 4985 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-node" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877916 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="nbdb" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877921 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="nbdb" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.877931 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-acl-logging" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.877936 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-acl-logging" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878089 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-node" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878099 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878105 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878113 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878120 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="kube-rbac-proxy-ovn-metrics" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878150 4985 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="northd" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878159 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878166 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovn-acl-logging" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878174 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="sbdb" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878184 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878191 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="nbdb" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.878323 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878331 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: E0224 10:21:38.878339 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878345 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.878479 4985 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerName="ovnkube-controller" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.880599 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.979877 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-kubelet\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.979958 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6c9\" (UniqueName: \"kubernetes.io/projected/1b3986ef-e9be-43db-9350-ccc7dd3f713f-kube-api-access-9n6c9\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.979988 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-netd\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980009 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-ovn\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980036 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-script-lib\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980052 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-log-socket\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980066 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-node-log\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980084 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-netns\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980099 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-env-overrides\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980115 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc 
kubenswrapper[4985]: I0224 10:21:38.980142 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovn-node-metrics-cert\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980162 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-ovn-kubernetes\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980182 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-var-lib-openvswitch\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980198 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-bin\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980221 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-systemd-units\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980238 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-etc-openvswitch\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980260 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-openvswitch\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980280 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-config\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980297 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-slash\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980311 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-systemd\") pod \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\" (UID: \"1b3986ef-e9be-43db-9350-ccc7dd3f713f\") " Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980431 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-systemd\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc 
kubenswrapper[4985]: I0224 10:21:38.980455 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980475 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-log-socket\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980490 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980511 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-run-netns\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980531 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-ovn\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980553 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-systemd-units\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980568 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-var-lib-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980587 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-ovnkube-config\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980604 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-cni-bin\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980619 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-ovnkube-script-lib\") pod \"ovnkube-node-mlwhn\" (UID: 
\"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980638 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9576\" (UniqueName: \"kubernetes.io/projected/276d922d-146f-4dac-b739-ba187cb0e0bc-kube-api-access-m9576\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980652 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-kubelet\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980671 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-cni-netd\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980688 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-slash\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980702 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-openvswitch\") pod 
\"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980716 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-node-log\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980729 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/276d922d-146f-4dac-b739-ba187cb0e0bc-ovn-node-metrics-cert\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980745 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-env-overrides\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980765 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-etc-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.980834 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod 
"1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981098 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981163 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981223 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981273 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-log-socket" (OuterVolumeSpecName: "log-socket") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981295 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-node-log" (OuterVolumeSpecName: "node-log") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981313 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981328 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981404 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981445 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981482 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981586 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981655 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981686 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981713 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.981784 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-slash" (OuterVolumeSpecName: "host-slash") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.982198 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.985970 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.986412 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3986ef-e9be-43db-9350-ccc7dd3f713f-kube-api-access-9n6c9" (OuterVolumeSpecName: "kube-api-access-9n6c9") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "kube-api-access-9n6c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:21:38 crc kubenswrapper[4985]: I0224 10:21:38.993725 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1b3986ef-e9be-43db-9350-ccc7dd3f713f" (UID: "1b3986ef-e9be-43db-9350-ccc7dd3f713f"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082118 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/276d922d-146f-4dac-b739-ba187cb0e0bc-ovn-node-metrics-cert\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082166 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-env-overrides\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082195 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-etc-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082219 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-systemd\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082239 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" 
Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082263 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082283 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-log-socket\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082309 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-run-netns\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082338 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-ovn\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082362 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-systemd-units\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 
10:21:39.082385 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-var-lib-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082411 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-ovnkube-config\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082435 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-cni-bin\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082456 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-ovnkube-script-lib\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082484 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9576\" (UniqueName: \"kubernetes.io/projected/276d922d-146f-4dac-b739-ba187cb0e0bc-kube-api-access-m9576\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082504 4985 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-kubelet\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082524 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-cni-netd\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082546 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-slash\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082564 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082576 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-node-log\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082614 4985 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082624 4985 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082633 4985 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082663 4985 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082672 4985 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082680 4985 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082688 4985 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-slash\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082696 4985 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc 
kubenswrapper[4985]: I0224 10:21:39.082705 4985 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082714 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n6c9\" (UniqueName: \"kubernetes.io/projected/1b3986ef-e9be-43db-9350-ccc7dd3f713f-kube-api-access-9n6c9\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082722 4985 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082730 4985 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082738 4985 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082746 4985 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-log-socket\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082754 4985 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-node-log\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082762 4985 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082770 4985 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b3986ef-e9be-43db-9350-ccc7dd3f713f-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082779 4985 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082788 4985 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b3986ef-e9be-43db-9350-ccc7dd3f713f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082796 4985 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b3986ef-e9be-43db-9350-ccc7dd3f713f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.082839 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-node-log\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083105 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-systemd-units\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083241 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-cni-netd\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083322 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-kubelet\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083391 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-slash\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083441 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083467 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc 
kubenswrapper[4985]: I0224 10:21:39.083534 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-etc-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083541 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-run-netns\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083607 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-systemd\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083618 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-log-socket\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083681 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083693 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-host-cni-bin\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083695 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-var-lib-openvswitch\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083790 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/276d922d-146f-4dac-b739-ba187cb0e0bc-run-ovn\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.083882 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-env-overrides\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.084772 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-ovnkube-config\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.084827 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/276d922d-146f-4dac-b739-ba187cb0e0bc-ovnkube-script-lib\") 
pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.086499 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/276d922d-146f-4dac-b739-ba187cb0e0bc-ovn-node-metrics-cert\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.103938 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9576\" (UniqueName: \"kubernetes.io/projected/276d922d-146f-4dac-b739-ba187cb0e0bc-kube-api-access-m9576\") pod \"ovnkube-node-mlwhn\" (UID: \"276d922d-146f-4dac-b739-ba187cb0e0bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.195397 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:39 crc kubenswrapper[4985]: W0224 10:21:39.226032 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276d922d_146f_4dac_b739_ba187cb0e0bc.slice/crio-a6aa5357df19e4f33fb00be38dc6281dc473fd1fc2851f1f3965f47d2cd737be WatchSource:0}: Error finding container a6aa5357df19e4f33fb00be38dc6281dc473fd1fc2851f1f3965f47d2cd737be: Status 404 returned error can't find the container with id a6aa5357df19e4f33fb00be38dc6281dc473fd1fc2851f1f3965f47d2cd737be Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.311168 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovnkube-controller/3.log" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.314056 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovn-acl-logging/0.log" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.314706 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27dpt_1b3986ef-e9be-43db-9350-ccc7dd3f713f/ovn-controller/0.log" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315174 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" exitCode=0 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315203 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" exitCode=0 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315212 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" 
containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" exitCode=0 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315220 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" exitCode=0 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315226 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" exitCode=0 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315233 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" exitCode=0 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315240 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" exitCode=143 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315246 4985 generic.go:334] "Generic (PLEG): container finished" podID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" exitCode=143 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315291 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315323 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315353 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315375 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315403 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315423 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315441 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315375 4985 scope.go:117] "RemoveContainer" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315459 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315544 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315555 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315561 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315569 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315574 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315579 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315584 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315590 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315606 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315622 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315629 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315634 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315639 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315644 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315649 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315654 4985 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315658 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315663 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315668 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315674 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315682 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315688 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315693 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} Feb 24 
10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315698 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315703 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315708 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315713 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315721 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315726 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315731 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315738 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27dpt" 
event={"ID":"1b3986ef-e9be-43db-9350-ccc7dd3f713f","Type":"ContainerDied","Data":"4eeee5c90be69f13ae3c1ebb0228649f43f2c4c7a73100d1c9ccefe388fd4555"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315747 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315753 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315757 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315762 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315768 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315773 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315778 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315783 4985 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315789 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.315794 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.318776 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/2.log" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.319595 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/1.log" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.319619 4985 generic.go:334] "Generic (PLEG): container finished" podID="731349d2-7b07-4bc9-81f8-c7d75bca842a" containerID="45ef150d9f586a382ad3aad2d1f3a60b3f6c286f0151c6dd77de5400353fb59a" exitCode=2 Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.319665 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerDied","Data":"45ef150d9f586a382ad3aad2d1f3a60b3f6c286f0151c6dd77de5400353fb59a"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.319739 4985 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.320391 4985 scope.go:117] "RemoveContainer" 
containerID="45ef150d9f586a382ad3aad2d1f3a60b3f6c286f0151c6dd77de5400353fb59a" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.320664 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-q24bf_openshift-multus(731349d2-7b07-4bc9-81f8-c7d75bca842a)\"" pod="openshift-multus/multus-q24bf" podUID="731349d2-7b07-4bc9-81f8-c7d75bca842a" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.321136 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"a6aa5357df19e4f33fb00be38dc6281dc473fd1fc2851f1f3965f47d2cd737be"} Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.347099 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.420595 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-27dpt"] Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.421125 4985 scope.go:117] "RemoveContainer" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.425381 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-27dpt"] Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.434505 4985 scope.go:117] "RemoveContainer" containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.450230 4985 scope.go:117] "RemoveContainer" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.482864 4985 scope.go:117] "RemoveContainer" 
containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.503449 4985 scope.go:117] "RemoveContainer" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.521792 4985 scope.go:117] "RemoveContainer" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.533719 4985 scope.go:117] "RemoveContainer" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.551281 4985 scope.go:117] "RemoveContainer" containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.566981 4985 scope.go:117] "RemoveContainer" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.567368 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": container with ID starting with 0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b not found: ID does not exist" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.567412 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} err="failed to get container status \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": rpc error: code = NotFound desc = could not find container \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": container with ID starting with 
0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.567436 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.567683 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": container with ID starting with df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1 not found: ID does not exist" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.567713 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} err="failed to get container status \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": rpc error: code = NotFound desc = could not find container \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": container with ID starting with df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.567739 4985 scope.go:117] "RemoveContainer" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.567971 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": container with ID starting with e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392 not found: ID does not exist" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" Feb 24 10:21:39 crc 
kubenswrapper[4985]: I0224 10:21:39.567998 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} err="failed to get container status \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": rpc error: code = NotFound desc = could not find container \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": container with ID starting with e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.568011 4985 scope.go:117] "RemoveContainer" containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.568391 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": container with ID starting with 3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866 not found: ID does not exist" containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.568412 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} err="failed to get container status \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": rpc error: code = NotFound desc = could not find container \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": container with ID starting with 3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.568432 4985 scope.go:117] "RemoveContainer" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" Feb 24 
10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.568663 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": container with ID starting with cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d not found: ID does not exist" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.568695 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} err="failed to get container status \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": rpc error: code = NotFound desc = could not find container \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": container with ID starting with cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.568711 4985 scope.go:117] "RemoveContainer" containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.569049 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": container with ID starting with af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323 not found: ID does not exist" containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.569070 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} err="failed to get container status 
\"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": rpc error: code = NotFound desc = could not find container \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": container with ID starting with af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.569088 4985 scope.go:117] "RemoveContainer" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.569458 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": container with ID starting with 97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d not found: ID does not exist" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.569476 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} err="failed to get container status \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": rpc error: code = NotFound desc = could not find container \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": container with ID starting with 97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.569488 4985 scope.go:117] "RemoveContainer" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.569785 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": container with ID starting with e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e not found: ID does not exist" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.569806 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} err="failed to get container status \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": rpc error: code = NotFound desc = could not find container \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": container with ID starting with e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.569828 4985 scope.go:117] "RemoveContainer" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.570082 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": container with ID starting with 7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61 not found: ID does not exist" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570102 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} err="failed to get container status \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": rpc error: code = NotFound desc = could not find container \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": container with ID 
starting with 7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570114 4985 scope.go:117] "RemoveContainer" containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" Feb 24 10:21:39 crc kubenswrapper[4985]: E0224 10:21:39.570322 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": container with ID starting with 8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920 not found: ID does not exist" containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570339 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} err="failed to get container status \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": rpc error: code = NotFound desc = could not find container \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": container with ID starting with 8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570349 4985 scope.go:117] "RemoveContainer" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570546 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} err="failed to get container status \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": rpc error: code = NotFound desc = could not find container \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": 
container with ID starting with 0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570569 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570824 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} err="failed to get container status \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": rpc error: code = NotFound desc = could not find container \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": container with ID starting with df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.570840 4985 scope.go:117] "RemoveContainer" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571114 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} err="failed to get container status \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": rpc error: code = NotFound desc = could not find container \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": container with ID starting with e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571132 4985 scope.go:117] "RemoveContainer" containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571391 4985 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} err="failed to get container status \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": rpc error: code = NotFound desc = could not find container \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": container with ID starting with 3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571413 4985 scope.go:117] "RemoveContainer" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571584 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} err="failed to get container status \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": rpc error: code = NotFound desc = could not find container \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": container with ID starting with cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571599 4985 scope.go:117] "RemoveContainer" containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571827 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} err="failed to get container status \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": rpc error: code = NotFound desc = could not find container \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": container with ID starting with af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323 not found: ID does not 
exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.571852 4985 scope.go:117] "RemoveContainer" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572083 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} err="failed to get container status \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": rpc error: code = NotFound desc = could not find container \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": container with ID starting with 97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572098 4985 scope.go:117] "RemoveContainer" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572367 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} err="failed to get container status \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": rpc error: code = NotFound desc = could not find container \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": container with ID starting with e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572383 4985 scope.go:117] "RemoveContainer" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572575 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} err="failed to get container status 
\"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": rpc error: code = NotFound desc = could not find container \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": container with ID starting with 7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572594 4985 scope.go:117] "RemoveContainer" containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572836 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} err="failed to get container status \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": rpc error: code = NotFound desc = could not find container \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": container with ID starting with 8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.572852 4985 scope.go:117] "RemoveContainer" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573139 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} err="failed to get container status \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": rpc error: code = NotFound desc = could not find container \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": container with ID starting with 0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573187 4985 scope.go:117] "RemoveContainer" 
containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573431 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} err="failed to get container status \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": rpc error: code = NotFound desc = could not find container \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": container with ID starting with df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573449 4985 scope.go:117] "RemoveContainer" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573731 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} err="failed to get container status \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": rpc error: code = NotFound desc = could not find container \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": container with ID starting with e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573751 4985 scope.go:117] "RemoveContainer" containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573950 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} err="failed to get container status \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": rpc error: code = NotFound desc = could 
not find container \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": container with ID starting with 3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.573970 4985 scope.go:117] "RemoveContainer" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.574216 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} err="failed to get container status \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": rpc error: code = NotFound desc = could not find container \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": container with ID starting with cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.574235 4985 scope.go:117] "RemoveContainer" containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.574525 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} err="failed to get container status \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": rpc error: code = NotFound desc = could not find container \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": container with ID starting with af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.574543 4985 scope.go:117] "RemoveContainer" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 
10:21:39.574833 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} err="failed to get container status \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": rpc error: code = NotFound desc = could not find container \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": container with ID starting with 97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.574855 4985 scope.go:117] "RemoveContainer" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575142 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} err="failed to get container status \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": rpc error: code = NotFound desc = could not find container \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": container with ID starting with e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575164 4985 scope.go:117] "RemoveContainer" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575393 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} err="failed to get container status \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": rpc error: code = NotFound desc = could not find container \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": container with ID starting with 
7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575417 4985 scope.go:117] "RemoveContainer" containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575644 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} err="failed to get container status \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": rpc error: code = NotFound desc = could not find container \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": container with ID starting with 8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575666 4985 scope.go:117] "RemoveContainer" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575899 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} err="failed to get container status \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": rpc error: code = NotFound desc = could not find container \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": container with ID starting with 0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.575922 4985 scope.go:117] "RemoveContainer" containerID="df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.576152 4985 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1"} err="failed to get container status \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": rpc error: code = NotFound desc = could not find container \"df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1\": container with ID starting with df7ed5513fec3c391f8a44d1039dd8a9d0e7c587428383737726821dfab5d0b1 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.576172 4985 scope.go:117] "RemoveContainer" containerID="e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.576410 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392"} err="failed to get container status \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": rpc error: code = NotFound desc = could not find container \"e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392\": container with ID starting with e3faa6c1361aca1d102b6dbfaf38fa3094a5c2334713ce70a8bf2f58f19d6392 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.576432 4985 scope.go:117] "RemoveContainer" containerID="3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.576644 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866"} err="failed to get container status \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": rpc error: code = NotFound desc = could not find container \"3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866\": container with ID starting with 3d76cc1e57df7809a67ce69b6690fd8cbe0ef9c364406b03bbb37bd110d65866 not found: ID does not 
exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.576729 4985 scope.go:117] "RemoveContainer" containerID="cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.577017 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d"} err="failed to get container status \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": rpc error: code = NotFound desc = could not find container \"cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d\": container with ID starting with cb7b5b1f27320e4bc00e48154c3bb41b1153e36c39a404cfc1d4a246f5909c5d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.577037 4985 scope.go:117] "RemoveContainer" containerID="af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.577253 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323"} err="failed to get container status \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": rpc error: code = NotFound desc = could not find container \"af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323\": container with ID starting with af211a7b5dbd1e1f86f340d63afe03ad2f9691d166810837cc5988a24ad4f323 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.577352 4985 scope.go:117] "RemoveContainer" containerID="97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.577639 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d"} err="failed to get container status 
\"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": rpc error: code = NotFound desc = could not find container \"97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d\": container with ID starting with 97f91c2ddd00ceb58ea5691c7cb057bd7cfe8add5fd1dfd334f39b9def30db0d not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.577687 4985 scope.go:117] "RemoveContainer" containerID="e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578028 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e"} err="failed to get container status \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": rpc error: code = NotFound desc = could not find container \"e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e\": container with ID starting with e04b766ff2b4eb366ace7cbf978ef84bfa922494ec3d28f4cbd75efe1cb54f1e not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578053 4985 scope.go:117] "RemoveContainer" containerID="7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578264 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61"} err="failed to get container status \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": rpc error: code = NotFound desc = could not find container \"7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61\": container with ID starting with 7630d2e046fcee78924010a8de1aefcdf7c24ef4e70c4ea89c193983a0d88a61 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578336 4985 scope.go:117] "RemoveContainer" 
containerID="8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578572 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920"} err="failed to get container status \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": rpc error: code = NotFound desc = could not find container \"8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920\": container with ID starting with 8de70126733b3634f4e3c1576c3095c089e9628a6147ce8ea79fa5242a2eb920 not found: ID does not exist" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578591 4985 scope.go:117] "RemoveContainer" containerID="0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b" Feb 24 10:21:39 crc kubenswrapper[4985]: I0224 10:21:39.578806 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b"} err="failed to get container status \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": rpc error: code = NotFound desc = could not find container \"0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b\": container with ID starting with 0deac0a06dc9dd969ef518376b34b0974f7314f897c1b21a0620a804e1bad55b not found: ID does not exist" Feb 24 10:21:40 crc kubenswrapper[4985]: I0224 10:21:40.275681 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3986ef-e9be-43db-9350-ccc7dd3f713f" path="/var/lib/kubelet/pods/1b3986ef-e9be-43db-9350-ccc7dd3f713f/volumes" Feb 24 10:21:40 crc kubenswrapper[4985]: I0224 10:21:40.330644 4985 generic.go:334] "Generic (PLEG): container finished" podID="276d922d-146f-4dac-b739-ba187cb0e0bc" containerID="4a475da411f63106dd26721b0220ca0a62b9126aa54a06f389b75b82f9540295" exitCode=0 Feb 24 10:21:40 crc kubenswrapper[4985]: I0224 
10:21:40.330680 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerDied","Data":"4a475da411f63106dd26721b0220ca0a62b9126aa54a06f389b75b82f9540295"} Feb 24 10:21:41 crc kubenswrapper[4985]: I0224 10:21:41.339586 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"b198a68c21357878ff868d651273d46d88d6d534e550947b8928f4b1b59470d7"} Feb 24 10:21:41 crc kubenswrapper[4985]: I0224 10:21:41.340062 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"2bc08b45c01d823d352675b9b852b68fdbf0a6c47fe7ddaaa5df092d9a073828"} Feb 24 10:21:41 crc kubenswrapper[4985]: I0224 10:21:41.340093 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"da4c0bf6709999c13261cb77f627ff58366d29d8b5d1189b0f3555df6f99dc72"} Feb 24 10:21:41 crc kubenswrapper[4985]: I0224 10:21:41.340107 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"adc110097fb61969bf0cda148e9d897651082c6f089b1c4160cefa41066cc704"} Feb 24 10:21:41 crc kubenswrapper[4985]: I0224 10:21:41.340121 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"3886f646769110e4af5ba46ea770090840d10272b2682514a699af1883f69d9d"} Feb 24 10:21:41 crc kubenswrapper[4985]: I0224 10:21:41.340133 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"2e1975730502bfcc571dfb8db555ec03b9dc03b2b7efa6712bf25082bcd48572"} Feb 24 10:21:43 crc kubenswrapper[4985]: I0224 10:21:43.625148 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:21:43 crc kubenswrapper[4985]: I0224 10:21:43.625669 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:21:44 crc kubenswrapper[4985]: I0224 10:21:44.361713 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"fcf663221f9b055d6717bf5783516f343219d680da61e93e24df0fdd0f333de9"} Feb 24 10:21:46 crc kubenswrapper[4985]: I0224 10:21:46.376955 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" event={"ID":"276d922d-146f-4dac-b739-ba187cb0e0bc","Type":"ContainerStarted","Data":"1cefc0106e7982353688094ea1218b326d1832195e8c3fd12fcf088e88eb6070"} Feb 24 10:21:46 crc kubenswrapper[4985]: I0224 10:21:46.377702 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:46 crc kubenswrapper[4985]: I0224 10:21:46.377726 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:46 crc kubenswrapper[4985]: I0224 
10:21:46.419409 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:46 crc kubenswrapper[4985]: I0224 10:21:46.453502 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" podStartSLOduration=8.453483583 podStartE2EDuration="8.453483583s" podCreationTimestamp="2026-02-24 10:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:21:46.422669239 +0000 UTC m=+790.896861829" watchObservedRunningTime="2026-02-24 10:21:46.453483583 +0000 UTC m=+790.927676153" Feb 24 10:21:47 crc kubenswrapper[4985]: I0224 10:21:47.385758 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:47 crc kubenswrapper[4985]: I0224 10:21:47.420646 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:21:54 crc kubenswrapper[4985]: I0224 10:21:54.264873 4985 scope.go:117] "RemoveContainer" containerID="45ef150d9f586a382ad3aad2d1f3a60b3f6c286f0151c6dd77de5400353fb59a" Feb 24 10:21:54 crc kubenswrapper[4985]: E0224 10:21:54.266280 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-q24bf_openshift-multus(731349d2-7b07-4bc9-81f8-c7d75bca842a)\"" pod="openshift-multus/multus-q24bf" podUID="731349d2-7b07-4bc9-81f8-c7d75bca842a" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.736927 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.738364 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.742578 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.742886 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.743470 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-njxcw" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.820452 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-run\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.820538 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-log\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.820619 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gc98\" (UniqueName: \"kubernetes.io/projected/5b573685-5737-4089-a679-a14f9a59ef86-kube-api-access-2gc98\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.820958 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-data\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 
10:21:59.922535 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gc98\" (UniqueName: \"kubernetes.io/projected/5b573685-5737-4089-a679-a14f9a59ef86-kube-api-access-2gc98\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.922665 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-data\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.922749 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-run\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.922791 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-log\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.923597 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-log\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.923672 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-run\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.923833 4985 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5b573685-5737-4089-a679-a14f9a59ef86-data\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:21:59 crc kubenswrapper[4985]: I0224 10:21:59.961731 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gc98\" (UniqueName: \"kubernetes.io/projected/5b573685-5737-4089-a679-a14f9a59ef86-kube-api-access-2gc98\") pod \"ceph\" (UID: \"5b573685-5737-4089-a679-a14f9a59ef86\") " pod="openstack/ceph" Feb 24 10:22:00 crc kubenswrapper[4985]: I0224 10:22:00.071165 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Feb 24 10:22:00 crc kubenswrapper[4985]: I0224 10:22:00.108704 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59458: no serving certificate available for the kubelet" Feb 24 10:22:00 crc kubenswrapper[4985]: I0224 10:22:00.125878 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59460: no serving certificate available for the kubelet" Feb 24 10:22:00 crc kubenswrapper[4985]: I0224 10:22:00.473688 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"5b573685-5737-4089-a679-a14f9a59ef86","Type":"ContainerStarted","Data":"8ac061395e0e857ca54781b7eaf4eb5b2be350f061a4c9208bd474072d3164c1"} Feb 24 10:22:01 crc kubenswrapper[4985]: I0224 10:22:01.302818 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59464: no serving certificate available for the kubelet" Feb 24 10:22:01 crc kubenswrapper[4985]: I0224 10:22:01.313070 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59474: no serving certificate available for the kubelet" Feb 24 10:22:02 crc kubenswrapper[4985]: I0224 10:22:02.462579 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59476: no serving certificate available for the kubelet" Feb 24 10:22:02 crc kubenswrapper[4985]: I0224 10:22:02.472031 4985 ???:1] "http: TLS 
handshake error from 192.168.126.11:59488: no serving certificate available for the kubelet" Feb 24 10:22:03 crc kubenswrapper[4985]: I0224 10:22:03.625282 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59504: no serving certificate available for the kubelet" Feb 24 10:22:03 crc kubenswrapper[4985]: I0224 10:22:03.637788 4985 ???:1] "http: TLS handshake error from 192.168.126.11:59516: no serving certificate available for the kubelet" Feb 24 10:22:04 crc kubenswrapper[4985]: I0224 10:22:04.841731 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52576: no serving certificate available for the kubelet" Feb 24 10:22:04 crc kubenswrapper[4985]: I0224 10:22:04.859490 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52592: no serving certificate available for the kubelet" Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.048952 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52596: no serving certificate available for the kubelet" Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.066524 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52604: no serving certificate available for the kubelet" Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.267823 4985 scope.go:117] "RemoveContainer" containerID="45ef150d9f586a382ad3aad2d1f3a60b3f6c286f0151c6dd77de5400353fb59a" Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.512455 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/2.log" Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.513214 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/1.log" Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.513282 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q24bf" 
event={"ID":"731349d2-7b07-4bc9-81f8-c7d75bca842a","Type":"ContainerStarted","Data":"a2677a247f18333b81e3fc56fd23f5cafe5ef01d3a5a9c5a403aecf150ad5c9a"} Feb 24 10:22:06 crc kubenswrapper[4985]: I0224 10:22:06.733546 4985 scope.go:117] "RemoveContainer" containerID="9ab2ed5b6b7b76ded6468be8fb5375bccf9f79a9b916fd1cc4fc2bc192140eb4" Feb 24 10:22:07 crc kubenswrapper[4985]: I0224 10:22:07.199639 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52606: no serving certificate available for the kubelet" Feb 24 10:22:07 crc kubenswrapper[4985]: I0224 10:22:07.213617 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52612: no serving certificate available for the kubelet" Feb 24 10:22:07 crc kubenswrapper[4985]: I0224 10:22:07.519872 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q24bf_731349d2-7b07-4bc9-81f8-c7d75bca842a/kube-multus/2.log" Feb 24 10:22:08 crc kubenswrapper[4985]: I0224 10:22:08.343230 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52626: no serving certificate available for the kubelet" Feb 24 10:22:08 crc kubenswrapper[4985]: I0224 10:22:08.354772 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52634: no serving certificate available for the kubelet" Feb 24 10:22:09 crc kubenswrapper[4985]: I0224 10:22:09.219983 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlwhn" Feb 24 10:22:09 crc kubenswrapper[4985]: I0224 10:22:09.555784 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52650: no serving certificate available for the kubelet" Feb 24 10:22:09 crc kubenswrapper[4985]: I0224 10:22:09.566592 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52658: no serving certificate available for the kubelet" Feb 24 10:22:10 crc kubenswrapper[4985]: I0224 10:22:10.730941 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52670: no serving certificate available for the kubelet" Feb 24 10:22:10 
crc kubenswrapper[4985]: I0224 10:22:10.747539 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52682: no serving certificate available for the kubelet" Feb 24 10:22:11 crc kubenswrapper[4985]: I0224 10:22:11.890085 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52686: no serving certificate available for the kubelet" Feb 24 10:22:11 crc kubenswrapper[4985]: I0224 10:22:11.900479 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52694: no serving certificate available for the kubelet" Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.135059 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52710: no serving certificate available for the kubelet" Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.145686 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52720: no serving certificate available for the kubelet" Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.624655 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.624718 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.624760 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.625264 4985 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5c4f0d0bca61173f77c9ab28eb284d806888a49c4136cca4d749de1d6a14995f"} pod="openshift-machine-config-operator/machine-config-daemon-hq52w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:22:13 crc kubenswrapper[4985]: I0224 10:22:13.625333 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" containerID="cri-o://5c4f0d0bca61173f77c9ab28eb284d806888a49c4136cca4d749de1d6a14995f" gracePeriod=600 Feb 24 10:22:14 crc kubenswrapper[4985]: I0224 10:22:14.305562 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52722: no serving certificate available for the kubelet" Feb 24 10:22:14 crc kubenswrapper[4985]: I0224 10:22:14.314704 4985 ???:1] "http: TLS handshake error from 192.168.126.11:52728: no serving certificate available for the kubelet" Feb 24 10:22:14 crc kubenswrapper[4985]: I0224 10:22:14.564557 4985 generic.go:334] "Generic (PLEG): container finished" podID="11c1c7b8-18df-4583-849f-76b62544344b" containerID="5c4f0d0bca61173f77c9ab28eb284d806888a49c4136cca4d749de1d6a14995f" exitCode=0 Feb 24 10:22:14 crc kubenswrapper[4985]: I0224 10:22:14.564606 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerDied","Data":"5c4f0d0bca61173f77c9ab28eb284d806888a49c4136cca4d749de1d6a14995f"} Feb 24 10:22:14 crc kubenswrapper[4985]: I0224 10:22:14.564651 4985 scope.go:117] "RemoveContainer" containerID="362c3bf398ab14b4ae57557da67c3d09f3b4f3cc2e68b6fa0d9b719d68656f2e" Feb 24 10:22:15 crc kubenswrapper[4985]: I0224 10:22:15.438154 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37578: no serving certificate available for the kubelet" Feb 24 10:22:15 crc 
kubenswrapper[4985]: I0224 10:22:15.454244 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37594: no serving certificate available for the kubelet" Feb 24 10:22:16 crc kubenswrapper[4985]: I0224 10:22:16.677676 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37604: no serving certificate available for the kubelet" Feb 24 10:22:16 crc kubenswrapper[4985]: I0224 10:22:16.694137 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37620: no serving certificate available for the kubelet" Feb 24 10:22:17 crc kubenswrapper[4985]: I0224 10:22:17.841939 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37626: no serving certificate available for the kubelet" Feb 24 10:22:17 crc kubenswrapper[4985]: I0224 10:22:17.854198 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37632: no serving certificate available for the kubelet" Feb 24 10:22:19 crc kubenswrapper[4985]: I0224 10:22:19.017874 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37634: no serving certificate available for the kubelet" Feb 24 10:22:19 crc kubenswrapper[4985]: I0224 10:22:19.033098 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37638: no serving certificate available for the kubelet" Feb 24 10:22:20 crc kubenswrapper[4985]: I0224 10:22:20.212605 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37654: no serving certificate available for the kubelet" Feb 24 10:22:20 crc kubenswrapper[4985]: I0224 10:22:20.231129 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37660: no serving certificate available for the kubelet" Feb 24 10:22:21 crc kubenswrapper[4985]: E0224 10:22:21.144489 4985 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Feb 24 10:22:21 crc kubenswrapper[4985]: E0224 10:22:21.145212 4985 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gc98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(5b573685-5737-4089-a679-a14f9a59ef86): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 10:22:21 crc kubenswrapper[4985]: E0224 10:22:21.146498 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceph" podUID="5b573685-5737-4089-a679-a14f9a59ef86" Feb 24 
10:22:21 crc kubenswrapper[4985]: I0224 10:22:21.359185 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37666: no serving certificate available for the kubelet" Feb 24 10:22:21 crc kubenswrapper[4985]: I0224 10:22:21.373720 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37674: no serving certificate available for the kubelet" Feb 24 10:22:21 crc kubenswrapper[4985]: I0224 10:22:21.627750 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"fb077e097f1b7751aafbb136e990ad3a2b3095f18d5ced9f1157f0ad3f65272a"} Feb 24 10:22:21 crc kubenswrapper[4985]: E0224 10:22:21.628167 4985 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="5b573685-5737-4089-a679-a14f9a59ef86" Feb 24 10:22:22 crc kubenswrapper[4985]: I0224 10:22:22.503469 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37690: no serving certificate available for the kubelet" Feb 24 10:22:22 crc kubenswrapper[4985]: I0224 10:22:22.518128 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37696: no serving certificate available for the kubelet" Feb 24 10:22:23 crc kubenswrapper[4985]: I0224 10:22:23.702706 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37708: no serving certificate available for the kubelet" Feb 24 10:22:23 crc kubenswrapper[4985]: I0224 10:22:23.718010 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37722: no serving certificate available for the kubelet" Feb 24 10:22:24 crc kubenswrapper[4985]: I0224 10:22:24.900683 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41410: no serving certificate available for the kubelet" Feb 24 10:22:24 crc kubenswrapper[4985]: I0224 10:22:24.912362 4985 ???:1] "http: TLS 
handshake error from 192.168.126.11:41420: no serving certificate available for the kubelet" Feb 24 10:22:26 crc kubenswrapper[4985]: I0224 10:22:26.087855 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41422: no serving certificate available for the kubelet" Feb 24 10:22:26 crc kubenswrapper[4985]: I0224 10:22:26.101560 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41426: no serving certificate available for the kubelet" Feb 24 10:22:26 crc kubenswrapper[4985]: I0224 10:22:26.674360 4985 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 24 10:22:27 crc kubenswrapper[4985]: I0224 10:22:27.244031 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41442: no serving certificate available for the kubelet" Feb 24 10:22:27 crc kubenswrapper[4985]: I0224 10:22:27.255771 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41456: no serving certificate available for the kubelet" Feb 24 10:22:28 crc kubenswrapper[4985]: I0224 10:22:28.394037 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41470: no serving certificate available for the kubelet" Feb 24 10:22:28 crc kubenswrapper[4985]: I0224 10:22:28.404507 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41482: no serving certificate available for the kubelet" Feb 24 10:22:29 crc kubenswrapper[4985]: I0224 10:22:29.578266 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41488: no serving certificate available for the kubelet" Feb 24 10:22:29 crc kubenswrapper[4985]: I0224 10:22:29.589711 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41502: no serving certificate available for the kubelet" Feb 24 10:22:30 crc kubenswrapper[4985]: I0224 10:22:30.781567 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41518: no serving certificate available for the kubelet" Feb 24 10:22:30 crc kubenswrapper[4985]: I0224 10:22:30.795474 4985 ???:1] "http: TLS handshake error 
from 192.168.126.11:41526: no serving certificate available for the kubelet" Feb 24 10:22:31 crc kubenswrapper[4985]: I0224 10:22:31.955475 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41528: no serving certificate available for the kubelet" Feb 24 10:22:31 crc kubenswrapper[4985]: I0224 10:22:31.966516 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41540: no serving certificate available for the kubelet" Feb 24 10:22:33 crc kubenswrapper[4985]: I0224 10:22:33.198213 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41548: no serving certificate available for the kubelet" Feb 24 10:22:33 crc kubenswrapper[4985]: I0224 10:22:33.213748 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41554: no serving certificate available for the kubelet" Feb 24 10:22:33 crc kubenswrapper[4985]: I0224 10:22:33.705024 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"5b573685-5737-4089-a679-a14f9a59ef86","Type":"ContainerStarted","Data":"c69f6dd8a475c92c9681889ee2d0590dde47e0a55fa85016a88c7e11a25dd4be"} Feb 24 10:22:33 crc kubenswrapper[4985]: I0224 10:22:33.729535 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=2.049407369 podStartE2EDuration="34.729513369s" podCreationTimestamp="2026-02-24 10:21:59 +0000 UTC" firstStartedPulling="2026-02-24 10:22:00.106520568 +0000 UTC m=+804.580713138" lastFinishedPulling="2026-02-24 10:22:32.786626578 +0000 UTC m=+837.260819138" observedRunningTime="2026-02-24 10:22:33.723480744 +0000 UTC m=+838.197673304" watchObservedRunningTime="2026-02-24 10:22:33.729513369 +0000 UTC m=+838.203705939" Feb 24 10:22:34 crc kubenswrapper[4985]: I0224 10:22:34.420518 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41562: no serving certificate available for the kubelet" Feb 24 10:22:34 crc kubenswrapper[4985]: I0224 10:22:34.433596 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41574: no 
serving certificate available for the kubelet" Feb 24 10:22:35 crc kubenswrapper[4985]: I0224 10:22:35.578730 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37022: no serving certificate available for the kubelet" Feb 24 10:22:35 crc kubenswrapper[4985]: I0224 10:22:35.727869 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37030: no serving certificate available for the kubelet" Feb 24 10:22:36 crc kubenswrapper[4985]: I0224 10:22:36.891130 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37046: no serving certificate available for the kubelet" Feb 24 10:22:36 crc kubenswrapper[4985]: I0224 10:22:36.904601 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37054: no serving certificate available for the kubelet" Feb 24 10:22:38 crc kubenswrapper[4985]: I0224 10:22:38.048249 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37068: no serving certificate available for the kubelet" Feb 24 10:22:38 crc kubenswrapper[4985]: I0224 10:22:38.062753 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37078: no serving certificate available for the kubelet" Feb 24 10:22:39 crc kubenswrapper[4985]: I0224 10:22:39.243728 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37088: no serving certificate available for the kubelet" Feb 24 10:22:39 crc kubenswrapper[4985]: I0224 10:22:39.282420 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37092: no serving certificate available for the kubelet" Feb 24 10:22:40 crc kubenswrapper[4985]: I0224 10:22:40.492099 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37108: no serving certificate available for the kubelet" Feb 24 10:22:40 crc kubenswrapper[4985]: I0224 10:22:40.509820 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37122: no serving certificate available for the kubelet" Feb 24 10:22:41 crc kubenswrapper[4985]: I0224 10:22:41.736307 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37138: no serving certificate available 
for the kubelet" Feb 24 10:22:41 crc kubenswrapper[4985]: I0224 10:22:41.756945 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37152: no serving certificate available for the kubelet" Feb 24 10:22:42 crc kubenswrapper[4985]: I0224 10:22:42.926565 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37154: no serving certificate available for the kubelet" Feb 24 10:22:42 crc kubenswrapper[4985]: I0224 10:22:42.942949 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37170: no serving certificate available for the kubelet" Feb 24 10:22:44 crc kubenswrapper[4985]: I0224 10:22:44.111048 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37184: no serving certificate available for the kubelet" Feb 24 10:22:44 crc kubenswrapper[4985]: I0224 10:22:44.122860 4985 ???:1] "http: TLS handshake error from 192.168.126.11:37192: no serving certificate available for the kubelet" Feb 24 10:22:45 crc kubenswrapper[4985]: I0224 10:22:45.359566 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33920: no serving certificate available for the kubelet" Feb 24 10:22:45 crc kubenswrapper[4985]: I0224 10:22:45.378041 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33928: no serving certificate available for the kubelet" Feb 24 10:22:46 crc kubenswrapper[4985]: I0224 10:22:46.569331 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33940: no serving certificate available for the kubelet" Feb 24 10:22:46 crc kubenswrapper[4985]: I0224 10:22:46.587624 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33954: no serving certificate available for the kubelet" Feb 24 10:22:47 crc kubenswrapper[4985]: I0224 10:22:47.808901 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33960: no serving certificate available for the kubelet" Feb 24 10:22:47 crc kubenswrapper[4985]: I0224 10:22:47.825392 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33972: no serving certificate available for the kubelet" Feb 24 
10:22:49 crc kubenswrapper[4985]: I0224 10:22:49.017528 4985 ???:1] "http: TLS handshake error from 192.168.126.11:33984: no serving certificate available for the kubelet"
Feb 24 10:22:49 crc kubenswrapper[4985]: I0224 10:22:49.033533 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34000: no serving certificate available for the kubelet"
Feb 24 10:22:50 crc kubenswrapper[4985]: I0224 10:22:50.190416 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34002: no serving certificate available for the kubelet"
Feb 24 10:22:50 crc kubenswrapper[4985]: I0224 10:22:50.209212 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34018: no serving certificate available for the kubelet"
Feb 24 10:22:51 crc kubenswrapper[4985]: I0224 10:22:51.344455 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34030: no serving certificate available for the kubelet"
Feb 24 10:22:51 crc kubenswrapper[4985]: I0224 10:22:51.355674 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34040: no serving certificate available for the kubelet"
Feb 24 10:22:52 crc kubenswrapper[4985]: I0224 10:22:52.560452 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34042: no serving certificate available for the kubelet"
Feb 24 10:22:52 crc kubenswrapper[4985]: I0224 10:22:52.576336 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34054: no serving certificate available for the kubelet"
Feb 24 10:22:53 crc kubenswrapper[4985]: I0224 10:22:53.761238 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34070: no serving certificate available for the kubelet"
Feb 24 10:22:53 crc kubenswrapper[4985]: I0224 10:22:53.773111 4985 ???:1] "http: TLS handshake error from 192.168.126.11:34078: no serving certificate available for the kubelet"
Feb 24 10:22:54 crc kubenswrapper[4985]: I0224 10:22:54.988446 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35340: no serving certificate available for the kubelet"
Feb 24 10:22:55 crc kubenswrapper[4985]: I0224 10:22:55.007262 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35348: no serving certificate available for the kubelet"
Feb 24 10:22:56 crc kubenswrapper[4985]: I0224 10:22:56.201684 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35360: no serving certificate available for the kubelet"
Feb 24 10:22:56 crc kubenswrapper[4985]: I0224 10:22:56.213753 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35364: no serving certificate available for the kubelet"
Feb 24 10:22:57 crc kubenswrapper[4985]: I0224 10:22:57.362788 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35374: no serving certificate available for the kubelet"
Feb 24 10:22:57 crc kubenswrapper[4985]: I0224 10:22:57.376556 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35380: no serving certificate available for the kubelet"
Feb 24 10:22:58 crc kubenswrapper[4985]: I0224 10:22:58.501376 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35384: no serving certificate available for the kubelet"
Feb 24 10:22:58 crc kubenswrapper[4985]: I0224 10:22:58.517215 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35390: no serving certificate available for the kubelet"
Feb 24 10:22:59 crc kubenswrapper[4985]: I0224 10:22:59.714927 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35402: no serving certificate available for the kubelet"
Feb 24 10:22:59 crc kubenswrapper[4985]: I0224 10:22:59.733217 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35410: no serving certificate available for the kubelet"
Feb 24 10:23:00 crc kubenswrapper[4985]: I0224 10:23:00.913800 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35418: no serving certificate available for the kubelet"
Feb 24 10:23:00 crc kubenswrapper[4985]: I0224 10:23:00.928102 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35432: no serving certificate available for the kubelet"
Feb 24 10:23:02 crc kubenswrapper[4985]: I0224 10:23:02.124098 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35448: no serving certificate available for the kubelet"
Feb 24 10:23:02 crc kubenswrapper[4985]: I0224 10:23:02.138813 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35458: no serving certificate available for the kubelet"
Feb 24 10:23:03 crc kubenswrapper[4985]: I0224 10:23:03.294982 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35468: no serving certificate available for the kubelet"
Feb 24 10:23:03 crc kubenswrapper[4985]: I0224 10:23:03.308984 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35470: no serving certificate available for the kubelet"
Feb 24 10:23:04 crc kubenswrapper[4985]: I0224 10:23:04.528053 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35478: no serving certificate available for the kubelet"
Feb 24 10:23:04 crc kubenswrapper[4985]: I0224 10:23:04.545493 4985 ???:1] "http: TLS handshake error from 192.168.126.11:35494: no serving certificate available for the kubelet"
Feb 24 10:23:05 crc kubenswrapper[4985]: I0224 10:23:05.745150 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42668: no serving certificate available for the kubelet"
Feb 24 10:23:05 crc kubenswrapper[4985]: I0224 10:23:05.759820 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42672: no serving certificate available for the kubelet"
Feb 24 10:23:06 crc kubenswrapper[4985]: I0224 10:23:06.896621 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42678: no serving certificate available for the kubelet"
Feb 24 10:23:06 crc kubenswrapper[4985]: I0224 10:23:06.910063 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42694: no serving certificate available for the kubelet"
Feb 24 10:23:08 crc kubenswrapper[4985]: I0224 10:23:08.098137 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42708: no serving certificate available for the kubelet"
Feb 24 10:23:08 crc kubenswrapper[4985]: I0224 10:23:08.108099 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42716: no serving certificate available for the kubelet"
Feb 24 10:23:09 crc kubenswrapper[4985]: I0224 10:23:09.278652 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42720: no serving certificate available for the kubelet"
Feb 24 10:23:09 crc kubenswrapper[4985]: I0224 10:23:09.292389 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42736: no serving certificate available for the kubelet"
Feb 24 10:23:10 crc kubenswrapper[4985]: I0224 10:23:10.453972 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42750: no serving certificate available for the kubelet"
Feb 24 10:23:10 crc kubenswrapper[4985]: I0224 10:23:10.466722 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42754: no serving certificate available for the kubelet"
Feb 24 10:23:11 crc kubenswrapper[4985]: I0224 10:23:11.638603 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42766: no serving certificate available for the kubelet"
Feb 24 10:23:11 crc kubenswrapper[4985]: I0224 10:23:11.650509 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42768: no serving certificate available for the kubelet"
Feb 24 10:23:12 crc kubenswrapper[4985]: I0224 10:23:12.814244 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42782: no serving certificate available for the kubelet"
Feb 24 10:23:12 crc kubenswrapper[4985]: I0224 10:23:12.828328 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42794: no serving certificate available for the kubelet"
Feb 24 10:23:13 crc kubenswrapper[4985]: I0224 10:23:13.997434 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42802: no serving certificate available for the kubelet"
Feb 24 10:23:14 crc kubenswrapper[4985]: I0224 10:23:14.010019 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42804: no serving certificate available for the kubelet"
Feb 24 10:23:15 crc kubenswrapper[4985]: I0224 10:23:15.211653 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40566: no serving certificate available for the kubelet"
Feb 24 10:23:15 crc kubenswrapper[4985]: I0224 10:23:15.226314 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40574: no serving certificate available for the kubelet"
Feb 24 10:23:16 crc kubenswrapper[4985]: I0224 10:23:16.409633 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40584: no serving certificate available for the kubelet"
Feb 24 10:23:16 crc kubenswrapper[4985]: I0224 10:23:16.422211 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40586: no serving certificate available for the kubelet"
Feb 24 10:23:17 crc kubenswrapper[4985]: I0224 10:23:17.582776 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40594: no serving certificate available for the kubelet"
Feb 24 10:23:17 crc kubenswrapper[4985]: I0224 10:23:17.598187 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40604: no serving certificate available for the kubelet"
Feb 24 10:23:18 crc kubenswrapper[4985]: I0224 10:23:18.763949 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40620: no serving certificate available for the kubelet"
Feb 24 10:23:18 crc kubenswrapper[4985]: I0224 10:23:18.778776 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40636: no serving certificate available for the kubelet"
Feb 24 10:23:19 crc kubenswrapper[4985]: I0224 10:23:19.936081 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40652: no serving certificate available for the kubelet"
Feb 24 10:23:19 crc kubenswrapper[4985]: I0224 10:23:19.950638 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40664: no serving certificate available for the kubelet"
Feb 24 10:23:21 crc kubenswrapper[4985]: I0224 10:23:21.129425 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40678: no serving certificate available for the kubelet"
Feb 24 10:23:21 crc kubenswrapper[4985]: I0224 10:23:21.141877 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40686: no serving certificate available for the kubelet"
Feb 24 10:23:22 crc kubenswrapper[4985]: I0224 10:23:22.296541 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40690: no serving certificate available for the kubelet"
Feb 24 10:23:22 crc kubenswrapper[4985]: I0224 10:23:22.308810 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40694: no serving certificate available for the kubelet"
Feb 24 10:23:23 crc kubenswrapper[4985]: I0224 10:23:23.453495 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40708: no serving certificate available for the kubelet"
Feb 24 10:23:23 crc kubenswrapper[4985]: I0224 10:23:23.468777 4985 ???:1] "http: TLS handshake error from 192.168.126.11:40718: no serving certificate available for the kubelet"
Feb 24 10:23:24 crc kubenswrapper[4985]: I0224 10:23:24.712210 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43626: no serving certificate available for the kubelet"
Feb 24 10:23:24 crc kubenswrapper[4985]: I0224 10:23:24.726216 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43638: no serving certificate available for the kubelet"
Feb 24 10:23:25 crc kubenswrapper[4985]: I0224 10:23:25.910687 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43644: no serving certificate available for the kubelet"
Feb 24 10:23:25 crc kubenswrapper[4985]: I0224 10:23:25.925747 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43656: no serving certificate available for the kubelet"
Feb 24 10:23:27 crc kubenswrapper[4985]: I0224 10:23:27.086415 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43672: no serving certificate available for the kubelet"
Feb 24 10:23:27 crc kubenswrapper[4985]: I0224 10:23:27.098421 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43684: no serving certificate available for the kubelet"
Feb 24 10:23:28 crc kubenswrapper[4985]: I0224 10:23:28.273648 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43694: no serving certificate available for the kubelet"
Feb 24 10:23:28 crc kubenswrapper[4985]: I0224 10:23:28.283948 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43700: no serving certificate available for the kubelet"
Feb 24 10:23:29 crc kubenswrapper[4985]: I0224 10:23:29.489073 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43704: no serving certificate available for the kubelet"
Feb 24 10:23:29 crc kubenswrapper[4985]: I0224 10:23:29.507715 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43706: no serving certificate available for the kubelet"
Feb 24 10:23:30 crc kubenswrapper[4985]: I0224 10:23:30.695804 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43716: no serving certificate available for the kubelet"
Feb 24 10:23:30 crc kubenswrapper[4985]: I0224 10:23:30.712061 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43728: no serving certificate available for the kubelet"
Feb 24 10:23:31 crc kubenswrapper[4985]: I0224 10:23:31.888288 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43734: no serving certificate available for the kubelet"
Feb 24 10:23:31 crc kubenswrapper[4985]: I0224 10:23:31.907630 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43740: no serving certificate available for the kubelet"
Feb 24 10:23:33 crc kubenswrapper[4985]: I0224 10:23:33.065109 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43742: no serving certificate available for the kubelet"
Feb 24 10:23:33 crc kubenswrapper[4985]: I0224 10:23:33.077517 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43756: no serving certificate available for the kubelet"
Feb 24 10:23:34 crc kubenswrapper[4985]: I0224 10:23:34.220284 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43762: no serving certificate available for the kubelet"
Feb 24 10:23:34 crc kubenswrapper[4985]: I0224 10:23:34.237287 4985 ???:1] "http: TLS handshake error from 192.168.126.11:43774: no serving certificate available for the kubelet"
Feb 24 10:23:35 crc kubenswrapper[4985]: I0224 10:23:35.457798 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46126: no serving certificate available for the kubelet"
Feb 24 10:23:35 crc kubenswrapper[4985]: I0224 10:23:35.475576 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46132: no serving certificate available for the kubelet"
Feb 24 10:23:36 crc kubenswrapper[4985]: I0224 10:23:36.669804 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46140: no serving certificate available for the kubelet"
Feb 24 10:23:36 crc kubenswrapper[4985]: I0224 10:23:36.684147 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46146: no serving certificate available for the kubelet"
Feb 24 10:23:37 crc kubenswrapper[4985]: I0224 10:23:37.875306 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46152: no serving certificate available for the kubelet"
Feb 24 10:23:37 crc kubenswrapper[4985]: I0224 10:23:37.890592 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46156: no serving certificate available for the kubelet"
Feb 24 10:23:39 crc kubenswrapper[4985]: I0224 10:23:39.103405 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46162: no serving certificate available for the kubelet"
Feb 24 10:23:39 crc kubenswrapper[4985]: I0224 10:23:39.121351 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46174: no serving certificate available for the kubelet"
Feb 24 10:23:40 crc kubenswrapper[4985]: I0224 10:23:40.338244 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46176: no serving certificate available for the kubelet"
Feb 24 10:23:40 crc kubenswrapper[4985]: I0224 10:23:40.355293 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46184: no serving certificate available for the kubelet"
Feb 24 10:23:41 crc kubenswrapper[4985]: I0224 10:23:41.637505 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46188: no serving certificate available for the kubelet"
Feb 24 10:23:41 crc kubenswrapper[4985]: I0224 10:23:41.657091 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46196: no serving certificate available for the kubelet"
Feb 24 10:23:42 crc kubenswrapper[4985]: I0224 10:23:42.873010 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46202: no serving certificate available for the kubelet"
Feb 24 10:23:42 crc kubenswrapper[4985]: I0224 10:23:42.889585 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46204: no serving certificate available for the kubelet"
Feb 24 10:23:44 crc kubenswrapper[4985]: I0224 10:23:44.059162 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46208: no serving certificate available for the kubelet"
Feb 24 10:23:44 crc kubenswrapper[4985]: I0224 10:23:44.071456 4985 ???:1] "http: TLS handshake error from 192.168.126.11:46210: no serving certificate available for the kubelet"
Feb 24 10:23:45 crc kubenswrapper[4985]: I0224 10:23:45.288828 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41734: no serving certificate available for the kubelet"
Feb 24 10:23:45 crc kubenswrapper[4985]: I0224 10:23:45.306618 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41750: no serving certificate available for the kubelet"
Feb 24 10:23:46 crc kubenswrapper[4985]: I0224 10:23:46.500678 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41752: no serving certificate available for the kubelet"
Feb 24 10:23:46 crc kubenswrapper[4985]: I0224 10:23:46.519628 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41756: no serving certificate available for the kubelet"
Feb 24 10:23:47 crc kubenswrapper[4985]: I0224 10:23:47.676469 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41762: no serving certificate available for the kubelet"
Feb 24 10:23:47 crc kubenswrapper[4985]: I0224 10:23:47.689485 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41768: no serving certificate available for the kubelet"
Feb 24 10:23:48 crc kubenswrapper[4985]: I0224 10:23:48.849290 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41772: no serving certificate available for the kubelet"
Feb 24 10:23:48 crc kubenswrapper[4985]: I0224 10:23:48.862579 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41778: no serving certificate available for the kubelet"
Feb 24 10:23:50 crc kubenswrapper[4985]: I0224 10:23:50.056133 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41782: no serving certificate available for the kubelet"
Feb 24 10:23:50 crc kubenswrapper[4985]: I0224 10:23:50.071077 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41790: no serving certificate available for the kubelet"
Feb 24 10:23:51 crc kubenswrapper[4985]: I0224 10:23:51.266835 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41796: no serving certificate available for the kubelet"
Feb 24 10:23:51 crc kubenswrapper[4985]: I0224 10:23:51.283644 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41800: no serving certificate available for the kubelet"
Feb 24 10:23:52 crc kubenswrapper[4985]: I0224 10:23:52.499243 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41806: no serving certificate available for the kubelet"
Feb 24 10:23:52 crc kubenswrapper[4985]: I0224 10:23:52.512681 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41822: no serving certificate available for the kubelet"
Feb 24 10:23:53 crc kubenswrapper[4985]: I0224 10:23:53.688567 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41830: no serving certificate available for the kubelet"
Feb 24 10:23:53 crc kubenswrapper[4985]: I0224 10:23:53.703799 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41840: no serving certificate available for the kubelet"
Feb 24 10:23:54 crc kubenswrapper[4985]: I0224 10:23:54.921236 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60560: no serving certificate available for the kubelet"
Feb 24 10:23:54 crc kubenswrapper[4985]: I0224 10:23:54.935463 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60568: no serving certificate available for the kubelet"
Feb 24 10:23:56 crc kubenswrapper[4985]: I0224 10:23:56.098396 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60584: no serving certificate available for the kubelet"
Feb 24 10:23:56 crc kubenswrapper[4985]: I0224 10:23:56.111515 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60590: no serving certificate available for the kubelet"
Feb 24 10:23:57 crc kubenswrapper[4985]: I0224 10:23:57.284532 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60594: no serving certificate available for the kubelet"
Feb 24 10:23:57 crc kubenswrapper[4985]: I0224 10:23:57.299628 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60606: no serving certificate available for the kubelet"
Feb 24 10:23:58 crc kubenswrapper[4985]: I0224 10:23:58.438548 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60608: no serving certificate available for the kubelet"
Feb 24 10:23:58 crc kubenswrapper[4985]: I0224 10:23:58.453978 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60624: no serving certificate available for the kubelet"
Feb 24 10:23:59 crc kubenswrapper[4985]: I0224 10:23:59.610009 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60630: no serving certificate available for the kubelet"
Feb 24 10:23:59 crc kubenswrapper[4985]: I0224 10:23:59.625049 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60632: no serving certificate available for the kubelet"
Feb 24 10:24:00 crc kubenswrapper[4985]: I0224 10:24:00.836141 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60644: no serving certificate available for the kubelet"
Feb 24 10:24:00 crc kubenswrapper[4985]: I0224 10:24:00.852673 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60652: no serving certificate available for the kubelet"
Feb 24 10:24:01 crc kubenswrapper[4985]: I0224 10:24:01.991222 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60658: no serving certificate available for the kubelet"
Feb 24 10:24:02 crc kubenswrapper[4985]: I0224 10:24:02.007323 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60674: no serving certificate available for the kubelet"
Feb 24 10:24:03 crc kubenswrapper[4985]: I0224 10:24:03.186487 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60676: no serving certificate available for the kubelet"
Feb 24 10:24:03 crc kubenswrapper[4985]: I0224 10:24:03.204764 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60684: no serving certificate available for the kubelet"
Feb 24 10:24:04 crc kubenswrapper[4985]: I0224 10:24:04.429656 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60694: no serving certificate available for the kubelet"
Feb 24 10:24:04 crc kubenswrapper[4985]: I0224 10:24:04.443712 4985 ???:1] "http: TLS handshake error from 192.168.126.11:60708: no serving certificate available for the kubelet"
Feb 24 10:24:05 crc kubenswrapper[4985]: I0224 10:24:05.692769 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56888: no serving certificate available for the kubelet"
Feb 24 10:24:05 crc kubenswrapper[4985]: I0224 10:24:05.707000 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56894: no serving certificate available for the kubelet"
Feb 24 10:24:06 crc kubenswrapper[4985]: I0224 10:24:06.887087 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56910: no serving certificate available for the kubelet"
Feb 24 10:24:06 crc kubenswrapper[4985]: I0224 10:24:06.903958 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56914: no serving certificate available for the kubelet"
Feb 24 10:24:08 crc kubenswrapper[4985]: I0224 10:24:08.056715 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56916: no serving certificate available for the kubelet"
Feb 24 10:24:08 crc kubenswrapper[4985]: I0224 10:24:08.074026 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56922: no serving certificate available for the kubelet"
Feb 24 10:24:09 crc kubenswrapper[4985]: I0224 10:24:09.299303 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56928: no serving certificate available for the kubelet"
Feb 24 10:24:09 crc kubenswrapper[4985]: I0224 10:24:09.316121 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56936: no serving certificate available for the kubelet"
Feb 24 10:24:10 crc kubenswrapper[4985]: I0224 10:24:10.488729 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56938: no serving certificate available for the kubelet"
Feb 24 10:24:10 crc kubenswrapper[4985]: I0224 10:24:10.507697 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56950: no serving certificate available for the kubelet"
Feb 24 10:24:11 crc kubenswrapper[4985]: I0224 10:24:11.693380 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56956: no serving certificate available for the kubelet"
Feb 24 10:24:11 crc kubenswrapper[4985]: I0224 10:24:11.712040 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56972: no serving certificate available for the kubelet"
Feb 24 10:24:12 crc kubenswrapper[4985]: I0224 10:24:12.953306 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56982: no serving certificate available for the kubelet"
Feb 24 10:24:12 crc kubenswrapper[4985]: I0224 10:24:12.969604 4985 ???:1] "http: TLS handshake error from 192.168.126.11:56998: no serving certificate available for the kubelet"
Feb 24 10:24:14 crc kubenswrapper[4985]: I0224 10:24:14.120828 4985 ???:1] "http: TLS handshake error from 192.168.126.11:57010: no serving certificate available for the kubelet"
Feb 24 10:24:14 crc kubenswrapper[4985]: I0224 10:24:14.138723 4985 ???:1] "http: TLS handshake error from 192.168.126.11:57016: no serving certificate available for the kubelet"
Feb 24 10:24:15 crc kubenswrapper[4985]: I0224 10:24:15.322688 4985 ???:1] "http: TLS handshake error from 192.168.126.11:53990: no serving certificate available for the kubelet"
Feb 24 10:24:15 crc kubenswrapper[4985]: I0224 10:24:15.335630 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54004: no serving certificate available for the kubelet"
Feb 24 10:24:16 crc kubenswrapper[4985]: I0224 10:24:16.556108 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54006: no serving certificate available for the kubelet"
Feb 24 10:24:16 crc kubenswrapper[4985]: I0224 10:24:16.576325 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54010: no serving certificate available for the kubelet"
Feb 24 10:24:17 crc kubenswrapper[4985]: I0224 10:24:17.802658 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54024: no serving certificate available for the kubelet"
Feb 24 10:24:17 crc kubenswrapper[4985]: I0224 10:24:17.819327 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54038: no serving certificate available for the kubelet"
Feb 24 10:24:19 crc kubenswrapper[4985]: I0224 10:24:19.000424 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54044: no serving certificate available for the kubelet"
Feb 24 10:24:19 crc kubenswrapper[4985]: I0224 10:24:19.020977 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54050: no serving certificate available for the kubelet"
Feb 24 10:24:20 crc kubenswrapper[4985]: I0224 10:24:20.234599 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54052: no serving certificate available for the kubelet"
Feb 24 10:24:20 crc kubenswrapper[4985]: I0224 10:24:20.247844 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54054: no serving certificate available for the kubelet"
Feb 24 10:24:21 crc kubenswrapper[4985]: I0224 10:24:21.452034 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54068: no serving certificate available for the kubelet"
Feb 24 10:24:21 crc kubenswrapper[4985]: I0224 10:24:21.470391 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54082: no serving certificate available for the kubelet"
Feb 24 10:24:22 crc kubenswrapper[4985]: I0224 10:24:22.684242 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54094: no serving certificate available for the kubelet"
Feb 24 10:24:22 crc kubenswrapper[4985]: I0224 10:24:22.699444 4985 ???:1] "http: TLS handshake error from 192.168.126.11:54096: no serving certificate available for the kubelet"
Feb 24 10:24:25 crc kubenswrapper[4985]: I0224 10:24:25.860432 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gsnlw"]
Feb 24 10:24:25 crc kubenswrapper[4985]: I0224 10:24:25.862641 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:25 crc kubenswrapper[4985]: I0224 10:24:25.883865 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsnlw"]
Feb 24 10:24:25 crc kubenswrapper[4985]: I0224 10:24:25.931729 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-utilities\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:25 crc kubenswrapper[4985]: I0224 10:24:25.931865 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-catalog-content\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:25 crc kubenswrapper[4985]: I0224 10:24:25.931929 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qncr4\" (UniqueName: \"kubernetes.io/projected/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-kube-api-access-qncr4\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.032846 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-catalog-content\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.032915 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qncr4\" (UniqueName: \"kubernetes.io/projected/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-kube-api-access-qncr4\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.032941 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-utilities\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.033403 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-utilities\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.033748 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-catalog-content\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.071505 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qncr4\" (UniqueName: \"kubernetes.io/projected/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-kube-api-access-qncr4\") pod \"certified-operators-gsnlw\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") " pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.194745 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:26 crc kubenswrapper[4985]: I0224 10:24:26.485266 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsnlw"]
Feb 24 10:24:27 crc kubenswrapper[4985]: I0224 10:24:27.452305 4985 generic.go:334] "Generic (PLEG): container finished" podID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerID="7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88" exitCode=0
Feb 24 10:24:27 crc kubenswrapper[4985]: I0224 10:24:27.452359 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerDied","Data":"7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88"}
Feb 24 10:24:27 crc kubenswrapper[4985]: I0224 10:24:27.452659 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerStarted","Data":"3d571ea1d800e41a4f7d0cac2769b92983de664ee5a6e3a67bcfe01315954c67"}
Feb 24 10:24:28 crc kubenswrapper[4985]: I0224 10:24:28.461320 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerStarted","Data":"1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4"}
Feb 24 10:24:29 crc kubenswrapper[4985]: I0224 10:24:29.469773 4985 generic.go:334] "Generic (PLEG): container finished" podID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerID="1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4" exitCode=0
Feb 24 10:24:29 crc kubenswrapper[4985]: I0224 10:24:29.469823 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerDied","Data":"1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4"}
Feb 24 10:24:30 crc kubenswrapper[4985]: I0224 10:24:30.478553 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerStarted","Data":"03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee"}
Feb 24 10:24:30 crc kubenswrapper[4985]: I0224 10:24:30.498034 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gsnlw" podStartSLOduration=3.00507973 podStartE2EDuration="5.498012908s" podCreationTimestamp="2026-02-24 10:24:25 +0000 UTC" firstStartedPulling="2026-02-24 10:24:27.454439379 +0000 UTC m=+951.928663620" lastFinishedPulling="2026-02-24 10:24:29.947404228 +0000 UTC m=+954.421596798" observedRunningTime="2026-02-24 10:24:30.495426847 +0000 UTC m=+954.969619447" watchObservedRunningTime="2026-02-24 10:24:30.498012908 +0000 UTC m=+954.972205488"
Feb 24 10:24:36 crc kubenswrapper[4985]: I0224 10:24:36.195578 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:36 crc kubenswrapper[4985]: I0224 10:24:36.196445 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:36 crc kubenswrapper[4985]: I0224 10:24:36.236653 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:36 crc kubenswrapper[4985]: I0224 10:24:36.600827 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:36 crc kubenswrapper[4985]: I0224 10:24:36.638138 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsnlw"]
Feb 24 10:24:38 crc kubenswrapper[4985]: I0224 10:24:38.549309 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gsnlw" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="registry-server" containerID="cri-o://03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee" gracePeriod=2
Feb 24 10:24:38 crc kubenswrapper[4985]: I0224 10:24:38.966074 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsnlw"
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.043861 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-utilities\") pod \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") "
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.043930 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qncr4\" (UniqueName: \"kubernetes.io/projected/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-kube-api-access-qncr4\") pod \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") "
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.044006 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-catalog-content\") pod \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\" (UID: \"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e\") "
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.046830 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-utilities" (OuterVolumeSpecName: "utilities") pod "2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" (UID: "2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.052036 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-kube-api-access-qncr4" (OuterVolumeSpecName: "kube-api-access-qncr4") pod "2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" (UID: "2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e"). InnerVolumeSpecName "kube-api-access-qncr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.145916 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qncr4\" (UniqueName: \"kubernetes.io/projected/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-kube-api-access-qncr4\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.145999 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.559055 4985 generic.go:334] "Generic (PLEG): container finished" podID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerID="03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee" exitCode=0
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.559113 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerDied","Data":"03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee"}
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.559147 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsnlw" event={"ID":"2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e","Type":"ContainerDied","Data":"3d571ea1d800e41a4f7d0cac2769b92983de664ee5a6e3a67bcfe01315954c67"}
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.559174 4985 scope.go:117] "RemoveContainer" containerID="03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee"
Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.559323 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsnlw" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.588385 4985 scope.go:117] "RemoveContainer" containerID="1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.616588 4985 scope.go:117] "RemoveContainer" containerID="7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.644579 4985 scope.go:117] "RemoveContainer" containerID="03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee" Feb 24 10:24:39 crc kubenswrapper[4985]: E0224 10:24:39.645249 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee\": container with ID starting with 03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee not found: ID does not exist" containerID="03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.645305 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee"} err="failed to get container status \"03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee\": rpc error: code = NotFound desc = could not find container \"03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee\": container with ID starting with 03757bb29250b49ac51ef822e00a229dca639cf26ebf7aada314e1929a1ed9ee not found: ID does not exist" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.645344 4985 scope.go:117] "RemoveContainer" containerID="1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4" Feb 24 10:24:39 crc kubenswrapper[4985]: E0224 10:24:39.646137 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4\": container with ID starting with 1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4 not found: ID does not exist" containerID="1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.646277 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4"} err="failed to get container status \"1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4\": rpc error: code = NotFound desc = could not find container \"1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4\": container with ID starting with 1870c7f0def7cfdee66679171e863da208e2bf0ec01246d00338afcde98234a4 not found: ID does not exist" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.646304 4985 scope.go:117] "RemoveContainer" containerID="7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88" Feb 24 10:24:39 crc kubenswrapper[4985]: E0224 10:24:39.646878 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88\": container with ID starting with 7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88 not found: ID does not exist" containerID="7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88" Feb 24 10:24:39 crc kubenswrapper[4985]: I0224 10:24:39.646970 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88"} err="failed to get container status \"7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88\": rpc error: code = NotFound desc = could not find container 
\"7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88\": container with ID starting with 7bfed95cdea4931d7515ee01f6cfd392c6d3efce7482fc56a3a3c4c4de771d88 not found: ID does not exist" Feb 24 10:24:40 crc kubenswrapper[4985]: I0224 10:24:40.172283 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" (UID: "2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:24:40 crc kubenswrapper[4985]: I0224 10:24:40.266357 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:24:40 crc kubenswrapper[4985]: I0224 10:24:40.486020 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsnlw"] Feb 24 10:24:40 crc kubenswrapper[4985]: I0224 10:24:40.490107 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gsnlw"] Feb 24 10:24:42 crc kubenswrapper[4985]: I0224 10:24:42.273661 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" path="/var/lib/kubelet/pods/2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e/volumes" Feb 24 10:24:43 crc kubenswrapper[4985]: I0224 10:24:43.625156 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:24:43 crc kubenswrapper[4985]: I0224 10:24:43.625236 4985 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.651678 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jxp4q/must-gather-vsghw"] Feb 24 10:25:04 crc kubenswrapper[4985]: E0224 10:25:04.652384 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="extract-content" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.652395 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="extract-content" Feb 24 10:25:04 crc kubenswrapper[4985]: E0224 10:25:04.652405 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="registry-server" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.652412 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="registry-server" Feb 24 10:25:04 crc kubenswrapper[4985]: E0224 10:25:04.652419 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="extract-utilities" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.652425 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="extract-utilities" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.652524 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c828eeb-e3d6-4a60-80cb-ef5bcc09e18e" containerName="registry-server" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.653052 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.655133 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jxp4q"/"kube-root-ca.crt" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.655190 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jxp4q"/"default-dockercfg-smzgb" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.655780 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jxp4q"/"openshift-service-ca.crt" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.699870 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jxp4q/must-gather-vsghw"] Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.796433 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wrqq\" (UniqueName: \"kubernetes.io/projected/46643c34-b364-4b70-b54d-32ff88076ca8-kube-api-access-4wrqq\") pod \"must-gather-vsghw\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") " pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.796490 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46643c34-b364-4b70-b54d-32ff88076ca8-must-gather-output\") pod \"must-gather-vsghw\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") " pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.897599 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wrqq\" (UniqueName: \"kubernetes.io/projected/46643c34-b364-4b70-b54d-32ff88076ca8-kube-api-access-4wrqq\") pod \"must-gather-vsghw\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") " 
pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.897740 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46643c34-b364-4b70-b54d-32ff88076ca8-must-gather-output\") pod \"must-gather-vsghw\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") " pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.898220 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46643c34-b364-4b70-b54d-32ff88076ca8-must-gather-output\") pod \"must-gather-vsghw\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") " pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.918143 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wrqq\" (UniqueName: \"kubernetes.io/projected/46643c34-b364-4b70-b54d-32ff88076ca8-kube-api-access-4wrqq\") pod \"must-gather-vsghw\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") " pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:04 crc kubenswrapper[4985]: I0224 10:25:04.980404 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxp4q/must-gather-vsghw" Feb 24 10:25:05 crc kubenswrapper[4985]: I0224 10:25:05.158583 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jxp4q/must-gather-vsghw"] Feb 24 10:25:05 crc kubenswrapper[4985]: I0224 10:25:05.735502 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxp4q/must-gather-vsghw" event={"ID":"46643c34-b364-4b70-b54d-32ff88076ca8","Type":"ContainerStarted","Data":"489e9a70ed9075bf138d2f338403341007aef2b41035715001380b8b2a1661dc"} Feb 24 10:25:11 crc kubenswrapper[4985]: I0224 10:25:11.772561 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxp4q/must-gather-vsghw" event={"ID":"46643c34-b364-4b70-b54d-32ff88076ca8","Type":"ContainerStarted","Data":"49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c"} Feb 24 10:25:11 crc kubenswrapper[4985]: I0224 10:25:11.772812 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxp4q/must-gather-vsghw" event={"ID":"46643c34-b364-4b70-b54d-32ff88076ca8","Type":"ContainerStarted","Data":"ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"} Feb 24 10:25:11 crc kubenswrapper[4985]: I0224 10:25:11.791120 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jxp4q/must-gather-vsghw" podStartSLOduration=1.782201025 podStartE2EDuration="7.791104051s" podCreationTimestamp="2026-02-24 10:25:04 +0000 UTC" firstStartedPulling="2026-02-24 10:25:05.158206858 +0000 UTC m=+989.632399418" lastFinishedPulling="2026-02-24 10:25:11.167109854 +0000 UTC m=+995.641302444" observedRunningTime="2026-02-24 10:25:11.789035425 +0000 UTC m=+996.263228005" watchObservedRunningTime="2026-02-24 10:25:11.791104051 +0000 UTC m=+996.265296621" Feb 24 10:25:13 crc kubenswrapper[4985]: I0224 10:25:13.625509 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:25:13 crc kubenswrapper[4985]: I0224 10:25:13.625913 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:25:14 crc kubenswrapper[4985]: I0224 10:25:14.652097 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41916: no serving certificate available for the kubelet" Feb 24 10:25:18 crc kubenswrapper[4985]: I0224 10:25:18.806275 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbbxs"] Feb 24 10:25:18 crc kubenswrapper[4985]: I0224 10:25:18.809981 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:18 crc kubenswrapper[4985]: I0224 10:25:18.842030 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbbxs"] Feb 24 10:25:18 crc kubenswrapper[4985]: I0224 10:25:18.898824 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh87x\" (UniqueName: \"kubernetes.io/projected/1c169a23-e0e7-4424-9589-4efdf739605f-kube-api-access-kh87x\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:18 crc kubenswrapper[4985]: I0224 10:25:18.898914 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-utilities\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:18 crc kubenswrapper[4985]: I0224 10:25:18.898974 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-catalog-content\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.000511 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh87x\" (UniqueName: \"kubernetes.io/projected/1c169a23-e0e7-4424-9589-4efdf739605f-kube-api-access-kh87x\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.000583 4985 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-utilities\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.000618 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-catalog-content\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.001175 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-utilities\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.001251 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-catalog-content\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.029169 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh87x\" (UniqueName: \"kubernetes.io/projected/1c169a23-e0e7-4424-9589-4efdf739605f-kube-api-access-kh87x\") pod \"community-operators-dbbxs\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.153066 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.592542 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbbxs"] Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.836203 4985 generic.go:334] "Generic (PLEG): container finished" podID="1c169a23-e0e7-4424-9589-4efdf739605f" containerID="75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43" exitCode=0 Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.836278 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbbxs" event={"ID":"1c169a23-e0e7-4424-9589-4efdf739605f","Type":"ContainerDied","Data":"75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43"} Feb 24 10:25:19 crc kubenswrapper[4985]: I0224 10:25:19.836349 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbbxs" event={"ID":"1c169a23-e0e7-4424-9589-4efdf739605f","Type":"ContainerStarted","Data":"9e02a2dbd8f7d7cb804fecf57a5f453732353a4552566e9d73f0b7434e720444"} Feb 24 10:25:20 crc kubenswrapper[4985]: I0224 10:25:20.843838 4985 generic.go:334] "Generic (PLEG): container finished" podID="1c169a23-e0e7-4424-9589-4efdf739605f" containerID="1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6" exitCode=0 Feb 24 10:25:20 crc kubenswrapper[4985]: I0224 10:25:20.844041 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbbxs" event={"ID":"1c169a23-e0e7-4424-9589-4efdf739605f","Type":"ContainerDied","Data":"1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6"} Feb 24 10:25:21 crc kubenswrapper[4985]: I0224 10:25:21.852056 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbbxs" 
event={"ID":"1c169a23-e0e7-4424-9589-4efdf739605f","Type":"ContainerStarted","Data":"ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390"} Feb 24 10:25:21 crc kubenswrapper[4985]: I0224 10:25:21.870727 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbbxs" podStartSLOduration=2.4886113979999998 podStartE2EDuration="3.870710291s" podCreationTimestamp="2026-02-24 10:25:18 +0000 UTC" firstStartedPulling="2026-02-24 10:25:19.837457901 +0000 UTC m=+1004.311650471" lastFinishedPulling="2026-02-24 10:25:21.219556794 +0000 UTC m=+1005.693749364" observedRunningTime="2026-02-24 10:25:21.867919214 +0000 UTC m=+1006.342111804" watchObservedRunningTime="2026-02-24 10:25:21.870710291 +0000 UTC m=+1006.344902851" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.578209 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tgmx9"] Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.579159 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.641199 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgmx9"] Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.645379 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-catalog-content\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.645465 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-utilities\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.645545 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgsvr\" (UniqueName: \"kubernetes.io/projected/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-kube-api-access-jgsvr\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.747524 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-catalog-content\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.747974 4985 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-utilities\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.748029 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgsvr\" (UniqueName: \"kubernetes.io/projected/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-kube-api-access-jgsvr\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.748043 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-catalog-content\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.748349 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-utilities\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.772701 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgsvr\" (UniqueName: \"kubernetes.io/projected/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-kube-api-access-jgsvr\") pod \"redhat-operators-tgmx9\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:22 crc kubenswrapper[4985]: I0224 10:25:22.904521 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:23 crc kubenswrapper[4985]: I0224 10:25:23.112458 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgmx9"] Feb 24 10:25:23 crc kubenswrapper[4985]: I0224 10:25:23.864810 4985 generic.go:334] "Generic (PLEG): container finished" podID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerID="5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa" exitCode=0 Feb 24 10:25:23 crc kubenswrapper[4985]: I0224 10:25:23.864903 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerDied","Data":"5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa"} Feb 24 10:25:23 crc kubenswrapper[4985]: I0224 10:25:23.865453 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerStarted","Data":"c59f7b1317fe626c8623c7d74a22efc7b17793a9f50250e23fea2d405f533bfa"} Feb 24 10:25:24 crc kubenswrapper[4985]: I0224 10:25:24.872819 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerStarted","Data":"8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe"} Feb 24 10:25:24 crc kubenswrapper[4985]: I0224 10:25:24.977417 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sjbps"] Feb 24 10:25:24 crc kubenswrapper[4985]: I0224 10:25:24.978850 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.029750 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjbps"] Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.073774 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-utilities\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.073852 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-catalog-content\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.073946 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9dp\" (UniqueName: \"kubernetes.io/projected/9892cb21-c7d9-4d52-ba49-3400d06437b0-kube-api-access-mn9dp\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.175170 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9dp\" (UniqueName: \"kubernetes.io/projected/9892cb21-c7d9-4d52-ba49-3400d06437b0-kube-api-access-mn9dp\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.175444 4985 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-utilities\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.175568 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-catalog-content\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.175946 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-utilities\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.178991 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-catalog-content\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.214782 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9dp\" (UniqueName: \"kubernetes.io/projected/9892cb21-c7d9-4d52-ba49-3400d06437b0-kube-api-access-mn9dp\") pod \"redhat-marketplace-sjbps\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") " pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.293577 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.519942 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjbps"] Feb 24 10:25:25 crc kubenswrapper[4985]: W0224 10:25:25.523690 4985 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9892cb21_c7d9_4d52_ba49_3400d06437b0.slice/crio-6021c8717c8a383f6bc2f4e8d5cb7a804a19bad08430b0a6dd9530caaacaf3c0 WatchSource:0}: Error finding container 6021c8717c8a383f6bc2f4e8d5cb7a804a19bad08430b0a6dd9530caaacaf3c0: Status 404 returned error can't find the container with id 6021c8717c8a383f6bc2f4e8d5cb7a804a19bad08430b0a6dd9530caaacaf3c0 Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.879861 4985 generic.go:334] "Generic (PLEG): container finished" podID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerID="ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c" exitCode=0 Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.880008 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjbps" event={"ID":"9892cb21-c7d9-4d52-ba49-3400d06437b0","Type":"ContainerDied","Data":"ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c"} Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.880215 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjbps" event={"ID":"9892cb21-c7d9-4d52-ba49-3400d06437b0","Type":"ContainerStarted","Data":"6021c8717c8a383f6bc2f4e8d5cb7a804a19bad08430b0a6dd9530caaacaf3c0"} Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 10:25:25.883373 4985 generic.go:334] "Generic (PLEG): container finished" podID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerID="8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe" exitCode=0 Feb 24 10:25:25 crc kubenswrapper[4985]: I0224 
10:25:25.883410 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerDied","Data":"8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe"} Feb 24 10:25:26 crc kubenswrapper[4985]: I0224 10:25:26.702064 4985 ???:1] "http: TLS handshake error from 192.168.126.11:38690: no serving certificate available for the kubelet" Feb 24 10:25:26 crc kubenswrapper[4985]: I0224 10:25:26.890100 4985 generic.go:334] "Generic (PLEG): container finished" podID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerID="f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c" exitCode=0 Feb 24 10:25:26 crc kubenswrapper[4985]: I0224 10:25:26.890158 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjbps" event={"ID":"9892cb21-c7d9-4d52-ba49-3400d06437b0","Type":"ContainerDied","Data":"f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c"} Feb 24 10:25:26 crc kubenswrapper[4985]: I0224 10:25:26.893059 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerStarted","Data":"1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33"} Feb 24 10:25:26 crc kubenswrapper[4985]: I0224 10:25:26.926870 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tgmx9" podStartSLOduration=2.4977559559999998 podStartE2EDuration="4.926853596s" podCreationTimestamp="2026-02-24 10:25:22 +0000 UTC" firstStartedPulling="2026-02-24 10:25:23.866665683 +0000 UTC m=+1008.340858243" lastFinishedPulling="2026-02-24 10:25:26.295763303 +0000 UTC m=+1010.769955883" observedRunningTime="2026-02-24 10:25:26.924878672 +0000 UTC m=+1011.399071252" watchObservedRunningTime="2026-02-24 10:25:26.926853596 +0000 UTC m=+1011.401046156" Feb 24 
10:25:27 crc kubenswrapper[4985]: I0224 10:25:27.902846 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjbps" event={"ID":"9892cb21-c7d9-4d52-ba49-3400d06437b0","Type":"ContainerStarted","Data":"612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e"} Feb 24 10:25:29 crc kubenswrapper[4985]: I0224 10:25:29.153864 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:29 crc kubenswrapper[4985]: I0224 10:25:29.154205 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:29 crc kubenswrapper[4985]: I0224 10:25:29.206027 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:29 crc kubenswrapper[4985]: I0224 10:25:29.237932 4985 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sjbps" podStartSLOduration=3.829282188 podStartE2EDuration="5.237880891s" podCreationTimestamp="2026-02-24 10:25:24 +0000 UTC" firstStartedPulling="2026-02-24 10:25:25.881558636 +0000 UTC m=+1010.355751196" lastFinishedPulling="2026-02-24 10:25:27.290157339 +0000 UTC m=+1011.764349899" observedRunningTime="2026-02-24 10:25:27.923050783 +0000 UTC m=+1012.397243363" watchObservedRunningTime="2026-02-24 10:25:29.237880891 +0000 UTC m=+1013.712073451" Feb 24 10:25:29 crc kubenswrapper[4985]: I0224 10:25:29.956618 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:31 crc kubenswrapper[4985]: I0224 10:25:31.576799 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbbxs"] Feb 24 10:25:31 crc kubenswrapper[4985]: I0224 10:25:31.925292 4985 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-dbbxs" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="registry-server" containerID="cri-o://ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390" gracePeriod=2 Feb 24 10:25:32 crc kubenswrapper[4985]: I0224 10:25:32.905591 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:32 crc kubenswrapper[4985]: I0224 10:25:32.905657 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:32 crc kubenswrapper[4985]: I0224 10:25:32.962649 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.029907 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.773822 4985 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.879330 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-utilities\") pod \"1c169a23-e0e7-4424-9589-4efdf739605f\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.879464 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh87x\" (UniqueName: \"kubernetes.io/projected/1c169a23-e0e7-4424-9589-4efdf739605f-kube-api-access-kh87x\") pod \"1c169a23-e0e7-4424-9589-4efdf739605f\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.879499 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-catalog-content\") pod \"1c169a23-e0e7-4424-9589-4efdf739605f\" (UID: \"1c169a23-e0e7-4424-9589-4efdf739605f\") " Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.880830 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-utilities" (OuterVolumeSpecName: "utilities") pod "1c169a23-e0e7-4424-9589-4efdf739605f" (UID: "1c169a23-e0e7-4424-9589-4efdf739605f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.895027 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c169a23-e0e7-4424-9589-4efdf739605f-kube-api-access-kh87x" (OuterVolumeSpecName: "kube-api-access-kh87x") pod "1c169a23-e0e7-4424-9589-4efdf739605f" (UID: "1c169a23-e0e7-4424-9589-4efdf739605f"). InnerVolumeSpecName "kube-api-access-kh87x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.941684 4985 generic.go:334] "Generic (PLEG): container finished" podID="1c169a23-e0e7-4424-9589-4efdf739605f" containerID="ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390" exitCode=0 Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.941728 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbbxs" event={"ID":"1c169a23-e0e7-4424-9589-4efdf739605f","Type":"ContainerDied","Data":"ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390"} Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.941786 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbbxs" event={"ID":"1c169a23-e0e7-4424-9589-4efdf739605f","Type":"ContainerDied","Data":"9e02a2dbd8f7d7cb804fecf57a5f453732353a4552566e9d73f0b7434e720444"} Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.941810 4985 scope.go:117] "RemoveContainer" containerID="ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.941742 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbbxs" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.962086 4985 scope.go:117] "RemoveContainer" containerID="1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.962369 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c169a23-e0e7-4424-9589-4efdf739605f" (UID: "1c169a23-e0e7-4424-9589-4efdf739605f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.981520 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh87x\" (UniqueName: \"kubernetes.io/projected/1c169a23-e0e7-4424-9589-4efdf739605f-kube-api-access-kh87x\") on node \"crc\" DevicePath \"\"" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.981574 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.981593 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c169a23-e0e7-4424-9589-4efdf739605f-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.984580 4985 scope.go:117] "RemoveContainer" containerID="75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.998953 4985 scope.go:117] "RemoveContainer" containerID="ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390" Feb 24 10:25:33 crc kubenswrapper[4985]: E0224 10:25:33.999325 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390\": container with ID starting with ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390 not found: ID does not exist" containerID="ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.999383 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390"} err="failed to get container status 
\"ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390\": rpc error: code = NotFound desc = could not find container \"ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390\": container with ID starting with ce095242b147545727e42a62a51106776ceafe21de3e4c908af0db8aefde0390 not found: ID does not exist" Feb 24 10:25:33 crc kubenswrapper[4985]: I0224 10:25:33.999408 4985 scope.go:117] "RemoveContainer" containerID="1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6" Feb 24 10:25:34 crc kubenswrapper[4985]: E0224 10:25:33.999921 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6\": container with ID starting with 1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6 not found: ID does not exist" containerID="1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6" Feb 24 10:25:34 crc kubenswrapper[4985]: I0224 10:25:33.999963 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6"} err="failed to get container status \"1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6\": rpc error: code = NotFound desc = could not find container \"1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6\": container with ID starting with 1a5c811ccec674a8f6f4040d9234d6f537b2d03ed2b38cebe11e77d1334569c6 not found: ID does not exist" Feb 24 10:25:34 crc kubenswrapper[4985]: I0224 10:25:33.999987 4985 scope.go:117] "RemoveContainer" containerID="75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43" Feb 24 10:25:34 crc kubenswrapper[4985]: E0224 10:25:34.000558 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43\": container with ID starting with 75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43 not found: ID does not exist" containerID="75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43" Feb 24 10:25:34 crc kubenswrapper[4985]: I0224 10:25:34.000609 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43"} err="failed to get container status \"75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43\": rpc error: code = NotFound desc = could not find container \"75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43\": container with ID starting with 75f9fcec5c9450058f4935935f0971d12e04b000a398045dfefdafd8bd318f43 not found: ID does not exist" Feb 24 10:25:34 crc kubenswrapper[4985]: I0224 10:25:34.287231 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbbxs"] Feb 24 10:25:34 crc kubenswrapper[4985]: I0224 10:25:34.290873 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dbbxs"] Feb 24 10:25:35 crc kubenswrapper[4985]: I0224 10:25:35.294175 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:35 crc kubenswrapper[4985]: I0224 10:25:35.294242 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:35 crc kubenswrapper[4985]: I0224 10:25:35.342286 4985 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:35 crc kubenswrapper[4985]: I0224 10:25:35.382486 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgmx9"] Feb 24 10:25:35 crc kubenswrapper[4985]: I0224 
10:25:35.383138 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tgmx9" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="registry-server" containerID="cri-o://1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33" gracePeriod=2 Feb 24 10:25:35 crc kubenswrapper[4985]: I0224 10:25:35.999872 4985 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sjbps" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.238122 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.275467 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" path="/var/lib/kubelet/pods/1c169a23-e0e7-4424-9589-4efdf739605f/volumes" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.440641 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-catalog-content\") pod \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.442328 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgsvr\" (UniqueName: \"kubernetes.io/projected/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-kube-api-access-jgsvr\") pod \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\" (UID: \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.442558 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-utilities\") pod \"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\" (UID: 
\"e5a3df9e-8835-48e1-ab21-bf367b97f7dd\") " Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.443500 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-utilities" (OuterVolumeSpecName: "utilities") pod "e5a3df9e-8835-48e1-ab21-bf367b97f7dd" (UID: "e5a3df9e-8835-48e1-ab21-bf367b97f7dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.451993 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-kube-api-access-jgsvr" (OuterVolumeSpecName: "kube-api-access-jgsvr") pod "e5a3df9e-8835-48e1-ab21-bf367b97f7dd" (UID: "e5a3df9e-8835-48e1-ab21-bf367b97f7dd"). InnerVolumeSpecName "kube-api-access-jgsvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.544253 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgsvr\" (UniqueName: \"kubernetes.io/projected/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-kube-api-access-jgsvr\") on node \"crc\" DevicePath \"\"" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.544286 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.550436 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5a3df9e-8835-48e1-ab21-bf367b97f7dd" (UID: "e5a3df9e-8835-48e1-ab21-bf367b97f7dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.645245 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a3df9e-8835-48e1-ab21-bf367b97f7dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.966282 4985 generic.go:334] "Generic (PLEG): container finished" podID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerID="1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33" exitCode=0 Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.966372 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgmx9" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.966402 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerDied","Data":"1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33"} Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.966474 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgmx9" event={"ID":"e5a3df9e-8835-48e1-ab21-bf367b97f7dd","Type":"ContainerDied","Data":"c59f7b1317fe626c8623c7d74a22efc7b17793a9f50250e23fea2d405f533bfa"} Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.966505 4985 scope.go:117] "RemoveContainer" containerID="1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33" Feb 24 10:25:36 crc kubenswrapper[4985]: I0224 10:25:36.987806 4985 scope.go:117] "RemoveContainer" containerID="8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.017636 4985 scope.go:117] "RemoveContainer" containerID="5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 
10:25:37.026985 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgmx9"] Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.031500 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tgmx9"] Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.055038 4985 scope.go:117] "RemoveContainer" containerID="1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33" Feb 24 10:25:37 crc kubenswrapper[4985]: E0224 10:25:37.055693 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33\": container with ID starting with 1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33 not found: ID does not exist" containerID="1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.055727 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33"} err="failed to get container status \"1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33\": rpc error: code = NotFound desc = could not find container \"1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33\": container with ID starting with 1ae5cf0b006e8168104d79a5f6e942865998590f55ed8e76e01890f7b64e6e33 not found: ID does not exist" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.055751 4985 scope.go:117] "RemoveContainer" containerID="8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe" Feb 24 10:25:37 crc kubenswrapper[4985]: E0224 10:25:37.056199 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe\": container with ID starting with 
8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe not found: ID does not exist" containerID="8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.056237 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe"} err="failed to get container status \"8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe\": rpc error: code = NotFound desc = could not find container \"8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe\": container with ID starting with 8482d6d24d7317098f0eb81525ec5ff71c40cdd3dbe672a6abebc0e8472aa5fe not found: ID does not exist" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.056263 4985 scope.go:117] "RemoveContainer" containerID="5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa" Feb 24 10:25:37 crc kubenswrapper[4985]: E0224 10:25:37.056653 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa\": container with ID starting with 5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa not found: ID does not exist" containerID="5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa" Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.056672 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa"} err="failed to get container status \"5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa\": rpc error: code = NotFound desc = could not find container \"5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa\": container with ID starting with 5c7751884635f8e6ea1de77826a8fc08cb487c06bc943bd30df76a68541e9aaa not found: ID does not 
exist"
Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.777033 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjbps"]
Feb 24 10:25:37 crc kubenswrapper[4985]: I0224 10:25:37.972251 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sjbps" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="registry-server" containerID="cri-o://612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e" gracePeriod=2
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.272883 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" path="/var/lib/kubelet/pods/e5a3df9e-8835-48e1-ab21-bf367b97f7dd/volumes"
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.323948 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjbps"
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.466416 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-utilities\") pod \"9892cb21-c7d9-4d52-ba49-3400d06437b0\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") "
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.466469 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-catalog-content\") pod \"9892cb21-c7d9-4d52-ba49-3400d06437b0\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") "
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.466559 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn9dp\" (UniqueName: \"kubernetes.io/projected/9892cb21-c7d9-4d52-ba49-3400d06437b0-kube-api-access-mn9dp\") pod \"9892cb21-c7d9-4d52-ba49-3400d06437b0\" (UID: \"9892cb21-c7d9-4d52-ba49-3400d06437b0\") "
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.467271 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-utilities" (OuterVolumeSpecName: "utilities") pod "9892cb21-c7d9-4d52-ba49-3400d06437b0" (UID: "9892cb21-c7d9-4d52-ba49-3400d06437b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.473018 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9892cb21-c7d9-4d52-ba49-3400d06437b0-kube-api-access-mn9dp" (OuterVolumeSpecName: "kube-api-access-mn9dp") pod "9892cb21-c7d9-4d52-ba49-3400d06437b0" (UID: "9892cb21-c7d9-4d52-ba49-3400d06437b0"). InnerVolumeSpecName "kube-api-access-mn9dp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.490994 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9892cb21-c7d9-4d52-ba49-3400d06437b0" (UID: "9892cb21-c7d9-4d52-ba49-3400d06437b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.568331 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn9dp\" (UniqueName: \"kubernetes.io/projected/9892cb21-c7d9-4d52-ba49-3400d06437b0-kube-api-access-mn9dp\") on node \"crc\" DevicePath \"\""
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.568366 4985 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.568374 4985 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9892cb21-c7d9-4d52-ba49-3400d06437b0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.981404 4985 generic.go:334] "Generic (PLEG): container finished" podID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerID="612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e" exitCode=0
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.981461 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjbps" event={"ID":"9892cb21-c7d9-4d52-ba49-3400d06437b0","Type":"ContainerDied","Data":"612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e"}
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.981497 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjbps" event={"ID":"9892cb21-c7d9-4d52-ba49-3400d06437b0","Type":"ContainerDied","Data":"6021c8717c8a383f6bc2f4e8d5cb7a804a19bad08430b0a6dd9530caaacaf3c0"}
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.981526 4985 scope.go:117] "RemoveContainer" containerID="612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e"
Feb 24 10:25:38 crc kubenswrapper[4985]: I0224 10:25:38.981725 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjbps"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.010014 4985 scope.go:117] "RemoveContainer" containerID="f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.022547 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjbps"]
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.030446 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjbps"]
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.041537 4985 scope.go:117] "RemoveContainer" containerID="ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.056794 4985 scope.go:117] "RemoveContainer" containerID="612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e"
Feb 24 10:25:39 crc kubenswrapper[4985]: E0224 10:25:39.057190 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e\": container with ID starting with 612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e not found: ID does not exist" containerID="612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.057234 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e"} err="failed to get container status \"612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e\": rpc error: code = NotFound desc = could not find container \"612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e\": container with ID starting with 612b33bf388592f0139301a1818381912a9dd8b7d37bf9ac16cea1ef3c4d514e not found: ID does not exist"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.057260 4985 scope.go:117] "RemoveContainer" containerID="f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c"
Feb 24 10:25:39 crc kubenswrapper[4985]: E0224 10:25:39.058180 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c\": container with ID starting with f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c not found: ID does not exist" containerID="f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.058211 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c"} err="failed to get container status \"f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c\": rpc error: code = NotFound desc = could not find container \"f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c\": container with ID starting with f33e246829da39b9937f9d088a4b79ea6e81d8377d2241cc9b11b3a39a36276c not found: ID does not exist"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.058236 4985 scope.go:117] "RemoveContainer" containerID="ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c"
Feb 24 10:25:39 crc kubenswrapper[4985]: E0224 10:25:39.058420 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c\": container with ID starting with ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c not found: ID does not exist" containerID="ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c"
Feb 24 10:25:39 crc kubenswrapper[4985]: I0224 10:25:39.058440 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c"} err="failed to get container status \"ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c\": rpc error: code = NotFound desc = could not find container \"ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c\": container with ID starting with ad1493d53b84caa59a79252e2edb9320e7cbacaf02496e5830498ca15418b05c not found: ID does not exist"
Feb 24 10:25:40 crc kubenswrapper[4985]: I0224 10:25:40.272096 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" path="/var/lib/kubelet/pods/9892cb21-c7d9-4d52-ba49-3400d06437b0/volumes"
Feb 24 10:25:43 crc kubenswrapper[4985]: I0224 10:25:43.624714 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:25:43 crc kubenswrapper[4985]: I0224 10:25:43.625207 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:25:43 crc kubenswrapper[4985]: I0224 10:25:43.625869 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:25:43 crc kubenswrapper[4985]: I0224 10:25:43.626837 4985 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb077e097f1b7751aafbb136e990ad3a2b3095f18d5ced9f1157f0ad3f65272a"} pod="openshift-machine-config-operator/machine-config-daemon-hq52w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 10:25:43 crc kubenswrapper[4985]: I0224 10:25:43.626991 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" containerID="cri-o://fb077e097f1b7751aafbb136e990ad3a2b3095f18d5ced9f1157f0ad3f65272a" gracePeriod=600
Feb 24 10:25:44 crc kubenswrapper[4985]: I0224 10:25:44.024323 4985 generic.go:334] "Generic (PLEG): container finished" podID="11c1c7b8-18df-4583-849f-76b62544344b" containerID="fb077e097f1b7751aafbb136e990ad3a2b3095f18d5ced9f1157f0ad3f65272a" exitCode=0
Feb 24 10:25:44 crc kubenswrapper[4985]: I0224 10:25:44.024353 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerDied","Data":"fb077e097f1b7751aafbb136e990ad3a2b3095f18d5ced9f1157f0ad3f65272a"}
Feb 24 10:25:44 crc kubenswrapper[4985]: I0224 10:25:44.024733 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"328f55488a375a575b2df8dda3d88778ab9a17302f1f3c6264b6a04577b21db8"}
Feb 24 10:25:44 crc kubenswrapper[4985]: I0224 10:25:44.024769 4985 scope.go:117] "RemoveContainer" containerID="5c4f0d0bca61173f77c9ab28eb284d806888a49c4136cca4d749de1d6a14995f"
Feb 24 10:25:52 crc kubenswrapper[4985]: I0224 10:25:52.866336 4985 ???:1] "http: TLS handshake error from 192.168.126.11:47392: no serving certificate available for the kubelet"
Feb 24 10:25:52 crc kubenswrapper[4985]: I0224 10:25:52.948235 4985 ???:1] "http: TLS handshake error from 192.168.126.11:47396: no serving certificate available for the kubelet"
Feb 24 10:25:52 crc kubenswrapper[4985]: I0224 10:25:52.963669 4985 ???:1] "http: TLS handshake error from 192.168.126.11:47404: no serving certificate available for the kubelet"
Feb 24 10:26:04 crc kubenswrapper[4985]: I0224 10:26:04.331002 4985 ???:1] "http: TLS handshake error from 192.168.126.11:55556: no serving certificate available for the kubelet"
Feb 24 10:26:04 crc kubenswrapper[4985]: I0224 10:26:04.410020 4985 ???:1] "http: TLS handshake error from 192.168.126.11:55558: no serving certificate available for the kubelet"
Feb 24 10:26:04 crc kubenswrapper[4985]: I0224 10:26:04.555480 4985 ???:1] "http: TLS handshake error from 192.168.126.11:55568: no serving certificate available for the kubelet"
Feb 24 10:26:29 crc kubenswrapper[4985]: E0224 10:26:29.301428 4985 certificate_manager.go:579] "Unhandled Error" err="kubernetes.io/kubelet-serving: certificate request was not signed: timed out waiting for the condition" logger="UnhandledError"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.343063 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44304: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.463857 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44314: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.465448 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44330: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.500438 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44340: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.619974 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44356: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.631956 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44368: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.661590 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44384: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.794927 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44390: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.917106 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44394: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.935509 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44400: no serving certificate available for the kubelet"
Feb 24 10:26:30 crc kubenswrapper[4985]: I0224 10:26:30.958172 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44404: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.069439 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44410: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.099656 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44424: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.104336 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44434: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.277430 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44446: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.279237 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44460: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.321881 4985 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.329437 4985 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.346134 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44476: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.370540 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44490: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.404435 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44506: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.410768 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44518: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.428965 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44532: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.442994 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44538: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.449650 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44548: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.503546 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44552: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.574403 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44558: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.584765 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44560: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.593948 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44574: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.604972 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44586: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.740402 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44590: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.786266 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44604: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.875394 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44608: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.933272 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44610: no serving certificate available for the kubelet"
Feb 24 10:26:31 crc kubenswrapper[4985]: I0224 10:26:31.937033 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44622: no serving certificate available for the kubelet"
Feb 24 10:26:32 crc kubenswrapper[4985]: I0224 10:26:32.075063 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44638: no serving certificate available for the kubelet"
Feb 24 10:26:32 crc kubenswrapper[4985]: I0224 10:26:32.075276 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44652: no serving certificate available for the kubelet"
Feb 24 10:26:32 crc kubenswrapper[4985]: I0224 10:26:32.119138 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44658: no serving certificate available for the kubelet"
Feb 24 10:26:32 crc kubenswrapper[4985]: I0224 10:26:32.125542 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44662: no serving certificate available for the kubelet"
Feb 24 10:26:32 crc kubenswrapper[4985]: I0224 10:26:32.787408 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44664: no serving certificate available for the kubelet"
Feb 24 10:26:34 crc kubenswrapper[4985]: I0224 10:26:34.091846 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44670: no serving certificate available for the kubelet"
Feb 24 10:26:36 crc kubenswrapper[4985]: I0224 10:26:36.686972 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44088: no serving certificate available for the kubelet"
Feb 24 10:26:41 crc kubenswrapper[4985]: I0224 10:26:41.833571 4985 ???:1] "http: TLS handshake error from 192.168.126.11:44090: no serving certificate available for the kubelet"
Feb 24 10:26:52 crc kubenswrapper[4985]: I0224 10:26:52.101402 4985 ???:1] "http: TLS handshake error from 192.168.126.11:55172: no serving certificate available for the kubelet"
Feb 24 10:27:12 crc kubenswrapper[4985]: I0224 10:27:12.607022 4985 ???:1] "http: TLS handshake error from 192.168.126.11:51656: no serving certificate available for the kubelet"
Feb 24 10:27:35 crc kubenswrapper[4985]: I0224 10:27:35.705858 4985 generic.go:334] "Generic (PLEG): container finished" podID="46643c34-b364-4b70-b54d-32ff88076ca8" containerID="ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f" exitCode=0
Feb 24 10:27:35 crc kubenswrapper[4985]: I0224 10:27:35.705957 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxp4q/must-gather-vsghw" event={"ID":"46643c34-b364-4b70-b54d-32ff88076ca8","Type":"ContainerDied","Data":"ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"}
Feb 24 10:27:35 crc kubenswrapper[4985]: I0224 10:27:35.706987 4985 scope.go:117] "RemoveContainer" containerID="ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"
Feb 24 10:27:43 crc kubenswrapper[4985]: I0224 10:27:43.625119 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:27:43 crc kubenswrapper[4985]: I0224 10:27:43.626069 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.682584 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42236: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.862744 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42250: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.872483 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42266: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.891939 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42272: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.901167 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42278: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.914369 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42280: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.923101 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42294: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.934369 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42306: no serving certificate available for the kubelet"
Feb 24 10:27:44 crc kubenswrapper[4985]: I0224 10:27:44.943133 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42314: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.081073 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42324: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.088904 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42330: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.103387 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42336: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.111007 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42350: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.124097 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42366: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.133311 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42368: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.144007 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42370: no serving certificate available for the kubelet"
Feb 24 10:27:45 crc kubenswrapper[4985]: I0224 10:27:45.152161 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42380: no serving certificate available for the kubelet"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.470975 4985 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jxp4q/must-gather-vsghw"]
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.471845 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jxp4q/must-gather-vsghw" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="copy" containerID="cri-o://49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c" gracePeriod=2
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.480235 4985 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jxp4q/must-gather-vsghw"]
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.807195 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxp4q_must-gather-vsghw_46643c34-b364-4b70-b54d-32ff88076ca8/copy/0.log"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.807821 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jxp4q/must-gather-vsghw"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.809976 4985 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxp4q_must-gather-vsghw_46643c34-b364-4b70-b54d-32ff88076ca8/copy/0.log"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.810296 4985 generic.go:334] "Generic (PLEG): container finished" podID="46643c34-b364-4b70-b54d-32ff88076ca8" containerID="49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c" exitCode=143
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.810351 4985 scope.go:117] "RemoveContainer" containerID="49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.810470 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jxp4q/must-gather-vsghw"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.829583 4985 scope.go:117] "RemoveContainer" containerID="ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.868495 4985 scope.go:117] "RemoveContainer" containerID="49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c"
Feb 24 10:27:50 crc kubenswrapper[4985]: E0224 10:27:50.868995 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c\": container with ID starting with 49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c not found: ID does not exist" containerID="49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.869038 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c"} err="failed to get container status \"49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c\": rpc error: code = NotFound desc = could not find container \"49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c\": container with ID starting with 49a3cc684519663176a07abd40ca2c07b1cd74e934e04fc4f899f6c07451a09c not found: ID does not exist"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.869064 4985 scope.go:117] "RemoveContainer" containerID="ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"
Feb 24 10:27:50 crc kubenswrapper[4985]: E0224 10:27:50.869527 4985 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f\": container with ID starting with ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f not found: ID does not exist" containerID="ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.869568 4985 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f"} err="failed to get container status \"ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f\": rpc error: code = NotFound desc = could not find container \"ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f\": container with ID starting with ab16ed95cc5f093c1f1be8e501b16fe89817fc1c16e4857c4f3e0c4cf135941f not found: ID does not exist"
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.973193 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wrqq\" (UniqueName: \"kubernetes.io/projected/46643c34-b364-4b70-b54d-32ff88076ca8-kube-api-access-4wrqq\") pod \"46643c34-b364-4b70-b54d-32ff88076ca8\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") "
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.973400 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46643c34-b364-4b70-b54d-32ff88076ca8-must-gather-output\") pod \"46643c34-b364-4b70-b54d-32ff88076ca8\" (UID: \"46643c34-b364-4b70-b54d-32ff88076ca8\") "
Feb 24 10:27:50 crc kubenswrapper[4985]: I0224 10:27:50.977972 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46643c34-b364-4b70-b54d-32ff88076ca8-kube-api-access-4wrqq" (OuterVolumeSpecName: "kube-api-access-4wrqq") pod "46643c34-b364-4b70-b54d-32ff88076ca8" (UID: "46643c34-b364-4b70-b54d-32ff88076ca8"). InnerVolumeSpecName "kube-api-access-4wrqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:27:51 crc kubenswrapper[4985]: I0224 10:27:51.031391 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46643c34-b364-4b70-b54d-32ff88076ca8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "46643c34-b364-4b70-b54d-32ff88076ca8" (UID: "46643c34-b364-4b70-b54d-32ff88076ca8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:27:51 crc kubenswrapper[4985]: I0224 10:27:51.074633 4985 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46643c34-b364-4b70-b54d-32ff88076ca8-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4985]: I0224 10:27:51.074662 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wrqq\" (UniqueName: \"kubernetes.io/projected/46643c34-b364-4b70-b54d-32ff88076ca8-kube-api-access-4wrqq\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:52 crc kubenswrapper[4985]: I0224 10:27:52.276405 4985 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" path="/var/lib/kubelet/pods/46643c34-b364-4b70-b54d-32ff88076ca8/volumes"
Feb 24 10:27:53 crc kubenswrapper[4985]: I0224 10:27:53.604472 4985 ???:1] "http: TLS handshake error from 192.168.126.11:42386: no serving certificate available for the kubelet"
Feb 24 10:28:13 crc kubenswrapper[4985]: I0224 10:28:13.625254 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:28:13 crc kubenswrapper[4985]: I0224 10:28:13.626059 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:28:43 crc kubenswrapper[4985]: I0224 10:28:43.625071 4985 patch_prober.go:28] interesting pod/machine-config-daemon-hq52w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:28:43 crc kubenswrapper[4985]: I0224 10:28:43.625517 4985 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:28:43 crc kubenswrapper[4985]: I0224 10:28:43.625570 4985 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hq52w"
Feb 24 10:28:43 crc kubenswrapper[4985]: I0224 10:28:43.626193 4985 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"328f55488a375a575b2df8dda3d88778ab9a17302f1f3c6264b6a04577b21db8"} pod="openshift-machine-config-operator/machine-config-daemon-hq52w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 10:28:43 crc kubenswrapper[4985]: I0224 10:28:43.626254 4985 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" podUID="11c1c7b8-18df-4583-849f-76b62544344b" containerName="machine-config-daemon" containerID="cri-o://328f55488a375a575b2df8dda3d88778ab9a17302f1f3c6264b6a04577b21db8" gracePeriod=600
Feb 24 10:28:44 crc kubenswrapper[4985]: I0224 10:28:44.178948 4985 generic.go:334] "Generic (PLEG): container finished" podID="11c1c7b8-18df-4583-849f-76b62544344b" containerID="328f55488a375a575b2df8dda3d88778ab9a17302f1f3c6264b6a04577b21db8" exitCode=0
Feb 24 10:28:44 crc kubenswrapper[4985]: I0224 10:28:44.179207 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerDied","Data":"328f55488a375a575b2df8dda3d88778ab9a17302f1f3c6264b6a04577b21db8"}
Feb 24 10:28:44 crc kubenswrapper[4985]: I0224 10:28:44.179232 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hq52w" event={"ID":"11c1c7b8-18df-4583-849f-76b62544344b","Type":"ContainerStarted","Data":"cb1afe3c9a28dc30a6f10375ba52167b399cfd5d97e05dcbc450b811c39a9073"}
Feb 24 10:28:44 crc kubenswrapper[4985]: I0224 10:28:44.179249 4985 scope.go:117] "RemoveContainer" containerID="fb077e097f1b7751aafbb136e990ad3a2b3095f18d5ced9f1157f0ad3f65272a"
Feb 24 10:29:15 crc kubenswrapper[4985]: I0224 10:29:15.555048 4985 ???:1] "http: TLS handshake error from 192.168.126.11:41672: no serving certificate available for the kubelet"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.137074 4985 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll"]
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138060 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="copy"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138080 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="copy"
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138094 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="extract-utilities"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138104 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="extract-utilities"
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138121 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="registry-server"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138134 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="registry-server"
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138149 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="extract-content"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138159 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="extract-content"
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138180 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="extract-utilities"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138200 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="extract-utilities"
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138220 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="extract-content"
Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138230 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="extract-content"
Feb 24 10:30:00 crc kubenswrapper[4985]: E0224
10:30:00.138247 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="extract-utilities" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138258 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="extract-utilities" Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138274 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138284 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138302 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="gather" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138312 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="gather" Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138328 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138338 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: E0224 10:30:00.138356 4985 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="extract-content" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138365 4985 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="extract-content" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138504 4985 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9892cb21-c7d9-4d52-ba49-3400d06437b0" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138527 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a3df9e-8835-48e1-ab21-bf367b97f7dd" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138545 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="gather" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138562 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="46643c34-b364-4b70-b54d-32ff88076ca8" containerName="copy" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.138576 4985 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c169a23-e0e7-4424-9589-4efdf739605f" containerName="registry-server" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.139172 4985 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.149643 4985 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.153148 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll"] Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.155438 4985 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.303829 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/deb87fac-8480-4119-b8d4-4665722c02d4-config-volume\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.303905 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg42p\" (UniqueName: \"kubernetes.io/projected/deb87fac-8480-4119-b8d4-4665722c02d4-kube-api-access-tg42p\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.303963 4985 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/deb87fac-8480-4119-b8d4-4665722c02d4-secret-volume\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.404544 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/deb87fac-8480-4119-b8d4-4665722c02d4-secret-volume\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.404871 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/deb87fac-8480-4119-b8d4-4665722c02d4-config-volume\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.404989 4985 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg42p\" (UniqueName: \"kubernetes.io/projected/deb87fac-8480-4119-b8d4-4665722c02d4-kube-api-access-tg42p\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.405695 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/deb87fac-8480-4119-b8d4-4665722c02d4-config-volume\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.409852 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/deb87fac-8480-4119-b8d4-4665722c02d4-secret-volume\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.424614 4985 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg42p\" (UniqueName: \"kubernetes.io/projected/deb87fac-8480-4119-b8d4-4665722c02d4-kube-api-access-tg42p\") pod \"collect-profiles-29532150-kd8ll\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.475498 4985 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.677985 4985 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll"] Feb 24 10:30:00 crc kubenswrapper[4985]: I0224 10:30:00.691344 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" event={"ID":"deb87fac-8480-4119-b8d4-4665722c02d4","Type":"ContainerStarted","Data":"33b502ee22e4c64b5edef867eeef4e2d6cbdab368ad6f9aa978f71a586f09a45"} Feb 24 10:30:01 crc kubenswrapper[4985]: I0224 10:30:01.700350 4985 generic.go:334] "Generic (PLEG): container finished" podID="deb87fac-8480-4119-b8d4-4665722c02d4" containerID="a272c80544861b188f064215256bd6829e4e9a7731b230174f44d73ea8b6a441" exitCode=0 Feb 24 10:30:01 crc kubenswrapper[4985]: I0224 10:30:01.700408 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" 
event={"ID":"deb87fac-8480-4119-b8d4-4665722c02d4","Type":"ContainerDied","Data":"a272c80544861b188f064215256bd6829e4e9a7731b230174f44d73ea8b6a441"} Feb 24 10:30:02 crc kubenswrapper[4985]: I0224 10:30:02.924797 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.037858 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg42p\" (UniqueName: \"kubernetes.io/projected/deb87fac-8480-4119-b8d4-4665722c02d4-kube-api-access-tg42p\") pod \"deb87fac-8480-4119-b8d4-4665722c02d4\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.038028 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/deb87fac-8480-4119-b8d4-4665722c02d4-config-volume\") pod \"deb87fac-8480-4119-b8d4-4665722c02d4\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.038077 4985 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/deb87fac-8480-4119-b8d4-4665722c02d4-secret-volume\") pod \"deb87fac-8480-4119-b8d4-4665722c02d4\" (UID: \"deb87fac-8480-4119-b8d4-4665722c02d4\") " Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.038998 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb87fac-8480-4119-b8d4-4665722c02d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "deb87fac-8480-4119-b8d4-4665722c02d4" (UID: "deb87fac-8480-4119-b8d4-4665722c02d4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.043083 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb87fac-8480-4119-b8d4-4665722c02d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "deb87fac-8480-4119-b8d4-4665722c02d4" (UID: "deb87fac-8480-4119-b8d4-4665722c02d4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.044363 4985 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb87fac-8480-4119-b8d4-4665722c02d4-kube-api-access-tg42p" (OuterVolumeSpecName: "kube-api-access-tg42p") pod "deb87fac-8480-4119-b8d4-4665722c02d4" (UID: "deb87fac-8480-4119-b8d4-4665722c02d4"). InnerVolumeSpecName "kube-api-access-tg42p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.139872 4985 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/deb87fac-8480-4119-b8d4-4665722c02d4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.139955 4985 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/deb87fac-8480-4119-b8d4-4665722c02d4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.139974 4985 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg42p\" (UniqueName: \"kubernetes.io/projected/deb87fac-8480-4119-b8d4-4665722c02d4-kube-api-access-tg42p\") on node \"crc\" DevicePath \"\"" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.712598 4985 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll" 
event={"ID":"deb87fac-8480-4119-b8d4-4665722c02d4","Type":"ContainerDied","Data":"33b502ee22e4c64b5edef867eeef4e2d6cbdab368ad6f9aa978f71a586f09a45"} Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.712657 4985 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b502ee22e4c64b5edef867eeef4e2d6cbdab368ad6f9aa978f71a586f09a45" Feb 24 10:30:03 crc kubenswrapper[4985]: I0224 10:30:03.712690 4985 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-kd8ll"